
Search Results (1,080)

Search Parameters:
Keywords = IEEE standards

22 pages, 2210 KB  
Article
Extreme Fast Charging Station for Multiple Vehicles with Sinusoidal Currents at the Grid Side and SiC-Based dc/dc Converters
by Dener A. de L. Brandao, Thiago M. Parreiras, Igor A. Pires and Braz J. Cardoso Filho
World Electr. Veh. J. 2026, 17(4), 215; https://doi.org/10.3390/wevj17040215 (registering DOI) - 18 Apr 2026
Abstract
Extreme fast charging (XFC) infrastructure is becoming increasingly necessary as the number of electric vehicles continues to grow. However, deploying such stations introduces several challenges related to power quality and compliance with regulatory standards. This work presents an alternative XFC station designed for charging multiple vehicles while ensuring low harmonic distortion in the grid currents, without the need for sinusoidal filters, by employing the Zero Harmonic Distortion (ZHD) converter. The proposed system offers galvanic isolation for each charging interface and supports additional functionalities, including the integration of Distributed Energy Resources (DERs) and the provision of ancillary services. These features are enabled through the combination of a bidirectional grid-connected active front-end operating at low switching frequency with high-frequency silicon carbide (SiC)-based dc/dc converters on the vehicle side. Hardware-in-the-loop (HIL) simulation results demonstrate a total demand distortion (TDD) of 1.12% for charging scenarios involving both 400 V and 800 V battery systems, remaining within the limits specified by IEEE 519-2022. Full article
(This article belongs to the Special Issue Power and Energy Systems for E-Mobility, 2nd Edition)
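The IEEE 519 compliance figure quoted in this abstract is a total demand distortion (TDD) ratio, which can be computed directly from harmonic current magnitudes. A minimal sketch in Python (the currents and demand level below are illustrative values, not the paper's data):

```python
import math

def total_demand_distortion(harmonic_currents, max_demand_current):
    """Total demand distortion (TDD) as defined in IEEE 519: the RMS of
    the harmonic currents (orders h >= 2) expressed as a percentage of
    the maximum demand load current, not the instantaneous fundamental."""
    rms_harmonics = math.sqrt(sum(i ** 2 for i in harmonic_currents))
    return 100.0 * rms_harmonics / max_demand_current

# Illustrative only: 5th/7th/11th harmonic currents (A) on a 400 A feeder
print(total_demand_distortion([3.0, 2.0, 1.0], 400.0))  # ≈ 0.94 (%)
```

Note that TDD is normalized to the maximum demand current rather than the present fundamental, which is why a lightly loaded converter can show high THD while still meeting the IEEE 519 TDD limit.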
43 pages, 3418 KB  
Systematic Review
IEC 61850 GOOSE: A Systematic Literature Review on the State of the Art and Current Applications
by Arthur Kniphoff da Cruz, Ana Clara Hackenhaar Kellermann, Ingridy Caroliny da Silva, Jaine Mercia Fernandes de Oliveira, Marcia Elena Jochims Kniphoff da Cruz and Lorenz Däubler
Automation 2026, 7(2), 62; https://doi.org/10.3390/automation7020062 - 17 Apr 2026
Abstract
To develop secure, fast, and interoperable smart substations, it is vital to understand the current situation and potential future directions of the technologies involved. This study presents the evolution and state of the art of the Generic Object Oriented Substation Event (GOOSE) communication protocol, defined by the International Electrotechnical Commission (IEC) 61850 standard. A Systematic Literature Review (SLR) was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol. This included journal articles published from 2004 to 2025 and conference papers from 2020 to 2025, written in English within Engineering. Only studies primarily focusing on GOOSE, citing it at least ten times, and indexed in the Scopus, IEEE Xplore, and Web of Science databases were included. The quantitative analysis used SciMAT software, complemented by a qualitative analysis. Due to the bibliometric and thematic nature of this review, potential biases were considered at the review level rather than by applying a formal study-level risk-of-bias tool. The final analysis comprised 82 journal articles and 84 conference papers. The results offer a comprehensive mapping of GOOSE research evolution, identify nine main challenges and limitations from the last 22 years, and highlight current research directions. The literature reveals methodological heterogeneity, a predominance of simulation-based approaches, and limited large-scale empirical validation. Full article
(This article belongs to the Special Issue Substation Automation, Protection and Control Based on IEC 61850)
30 pages, 5697 KB  
Article
Petri-Net-Based Interlocking and Supervisory Logic for Tap-Changer-Assisted Transformers: A Formalized Control Approach
by Alfonso Montenegro and Luis Tipán
Energies 2026, 19(8), 1943; https://doi.org/10.3390/en19081943 - 17 Apr 2026
Abstract
The increasing operational variability in distribution networks (e.g., abrupt load changes and distributed generation integration) increases the demands on voltage regulation devices and, in particular, on transformers with on-load tap changers (OLTCs). This paper develops and validates a discrete supervisory control scheme based on Petri nets, implemented in Stateflow and coupled to an electromagnetic model of the OLTC transformer in Simulink/Simscape. The Petri net formalizes the conditional and sequential logic of OLTC operation, enabling state- and time-dependent decisions (e.g., delays between maneuvers) to improve voltage regulation and reduce unnecessary tap operations. The evaluation is performed by simulation under transient scenarios that include sudden load variations and a phase-to-ground fault in the IEEE 13-node standard network, specifically at node 634. In the base case, the controller maintains the voltage within the tolerance band ±1.875% during 96% of the simulated time, with an 88% reduction in RMS error (from 1.92% to 0.23%) and 100% operational efficiency (16 effective maneuvers, with a single hunting event). Subsequently, the scheme is validated on the standard IEEE 13-node network, with four disturbances applied over 600 s (two load increments, photovoltaic injection, and a temporary line disconnection). In this case, regulation remains within a precision zone of ±0.3% for 96.8% of the time, with an average RMS error of 0.23% and 100% efficiency, with no hunting events. The results confirm that a Petri net-based supervisory logic can simultaneously improve the OLTC’s voltage quality and switching efficiency, providing a reproducible alternative for distribution network automation. Full article
(This article belongs to the Section F1: Electrical Power System)
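The Petri-net interlocking described in this abstract amounts to a token game: a transition may fire only when every one of its input places holds a token, which is how the supervisory layer blocks a tap maneuver until all preconditions (out-of-band voltage, expired inter-maneuver delay) hold at once. A minimal sketch; the place and transition names are illustrative, not the paper's model:

```python
class PetriNet:
    """Minimal place/transition net: a transition is enabled only when
    every one of its input places holds at least one token."""
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Illustrative interlock: a tap-raise maneuver needs both an
# out-of-band voltage condition and an expired inter-maneuver delay.
net = PetriNet({"voltage_low": 1, "delay_expired": 0, "tap_raised": 0})
net.add_transition("raise_tap",
                   inputs=["voltage_low", "delay_expired"],
                   outputs=["tap_raised"])
print(net.enabled("raise_tap"))   # False: delay timer still running
net.marking["delay_expired"] = 1  # timer place receives its token
net.fire("raise_tap")
print(net.marking["tap_raised"])  # 1: maneuver completed
```

Encoding the delay as a token in its own place is what gives the scheme its anti-hunting behavior: consecutive maneuvers are structurally impossible until the timer transition re-deposits the token.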
32 pages, 2980 KB  
Article
Optimal Penetration Level of Photovoltaic Units in Distribution Networks Considering Engineering and Economic Performance Using the Pied Kingfisher Optimizer
by Chau Le Thi Minh, Hong Hai Pham, Thang Trung Nguyen, Minh Quan Duong and Marco Mussetta
Electronics 2026, 15(8), 1674; https://doi.org/10.3390/electronics15081674 - 16 Apr 2026
Abstract
This study proposes a new approach for optimizing the penetration level of photovoltaic units (PVUs) to achieve both engineering and economic benefits in a standard distribution power system. The Mirage Search Optimization (MSO) and the Pied Kingfisher Optimizer (PKO) are applied to minimize the total active power loss (TRPL) in the IEEE 69-node system. Two cases are considered: Case 1, where PVUs inject only active power, and Case 2, where PVUs inject both active and reactive power. The results demonstrate that PKO outperforms MSO and several metaheuristic algorithms reported in the literature. In Case 2, the optimal PVU penetration level of 67.17% significantly reduces TRPL compared with Case 1. The effectiveness of this optimized penetration level is further evaluated by comparing it with four other penetration levels: 25%, 50%, 75%, and 100%. PKO is then used to optimize the 24 h energy cost considering load variation and dynamic PV generation during four months of the year, including December, September, June, and March, which are ordered by increasing solar radiation across seasons under Vietnam’s climatic conditions. The results show that although the 75% penetration level slightly reduces the energy purchasing cost compared with the optimal level, it requires higher power capacity. Therefore, the optimized penetration level of 67.17% provides a balanced solution for reducing power losses while maintaining economic efficiency. Full article
(This article belongs to the Section Industrial Electronics)
18 pages, 933 KB  
Article
Optimal Performance Design of Passive Power Filters Using a Multi-Objective Firefly Algorithm
by Mahmoud B. Mahmoud, Amira M. Salama, Mustafa AL-Tawfiq, Khaled H. Ibrahim and Eslam M. Abd Elaziz
AppliedMath 2026, 6(4), 62; https://doi.org/10.3390/appliedmath6040062 - 16 Apr 2026
Abstract
Harmonic distortion in power systems, primarily caused by nonlinear loads, leads to significant power quality issues such as increased losses, reduced power factor, and equipment malfunctions. To mitigate these effects, passive power filters (PPFs) are widely employed due to their cost-effectiveness and simplicity. This paper presents an optimized design of a single-tuned passive filter (STPF) using the Firefly Algorithm (FFA) and its multi-objective extension, the Multi-Objective Firefly Algorithm (MOFA). The optimization aims to minimize both voltage total harmonic distortion (VTHD) and power loss and to maximize the power factor (PF) while complying with IEEE 519-2014 standards. The study evaluates the proposed method under two different industrial case studies with varying system parameters and harmonic profiles. Simulation results demonstrate that the proposed FFA-based optimization outperforms the Mixed Integer Distributed Ant Colony Optimization (MIDACO) method, achieving superior VTHD reduction, power loss minimization, and power factor enhancement. The MOFA approach provides a Pareto-optimal front, offering trade-offs among competing objectives. Comparative analysis confirms the efficiency, robustness, and faster convergence of FFA-based optimization, making it a promising approach for optimal filter design in power systems. Full article
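A single-tuned passive filter of the kind being optimized here is conventionally sized from three textbook relations: the capacitor supplies a chosen fundamental reactive power, the inductor tunes the branch to the target harmonic order, and the resistor sets the quality factor. A sketch under those standard formulas (the ratings below are illustrative assumptions, not the paper's case-study data):

```python
import math

def single_tuned_filter(q_var, v_line, f_fund, h, quality_factor=30.0):
    """Size a single-tuned passive filter from textbook relations
    (illustrative, not the paper's FFA-optimized design):
      C supplies q_var of fundamental reactive power at v_line,
      L tunes the LC branch to harmonic order h,
      R sets the quality factor Q = X_n / R at the tuned frequency."""
    w = 2.0 * math.pi * f_fund
    c = q_var / (w * v_line ** 2)      # capacitance, F
    l = 1.0 / ((h * w) ** 2 * c)       # inductance, H
    x_n = math.sqrt(l / c)             # characteristic reactance, ohm
    r = x_n / quality_factor           # damping resistance, ohm
    return c, l, r

# Illustrative: 300 kvar at 6.6 kV, 50 Hz, detuned slightly below the
# 5th harmonic (h = 4.8), as is common practice to avoid resonance drift
c, l, r = single_tuned_filter(300e3, 6.6e3, 50.0, 4.8)
```

By construction the branch resonates at h times the fundamental (here 240 Hz); metaheuristics such as the paper's FFA/MOFA then search over q_var, h, and Q rather than over L and C directly.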
37 pages, 1793 KB  
Systematic Review
The Role of Artificial Intelligence in Prognosis, Recurrence Prediction, and Treatment Outcomes in Laryngeal Cancer: A Systematic Review
by Hadi Afandi Al-Hakami, Ismail A. Abdullah, Nora S. Almutairi, Rimaz R. Aldawsari, Ghadah Ali Alluqmani, Halah Ahmed Fallatah, Yara Saud Alsulami, Elyas Mohammed Alasiri, Rahaf D. Alsufyani, Raghad Ayman Alorabi and Reffal Mohammad Aldainiy
Cancers 2026, 18(8), 1257; https://doi.org/10.3390/cancers18081257 - 16 Apr 2026
Abstract
Background: Laryngeal cancer (LC), a common subtype of head and neck cancers (HNC), is most frequently represented by laryngeal squamous cell carcinoma (LSCC). Prognosis largely depends on early detection; however, traditional prognostic tools, including tumor-node-metastasis (TNM) staging, often show limited predictive accuracy. Artificial intelligence (AI), including machine learning (ML), natural language processing, and deep learning (DL), has emerged as a promising approach to improving cancer diagnosis, prognosis, and treatment planning by analyzing clinical data and medical imaging. Objective: This systematic review assesses the role of AI in prognosis, recurrence prediction, and treatment outcomes in LC. Methods: PubMed, MEDLINE, Scopus, Web of Science, IEEE Xplore, and ScienceDirect were searched up to January 2025. A total of 1062 records were identified; after title/abstract screening and full-text assessment, 29 studies were included. Eligible studies involved adult patients with LC and applied AI to diagnose, prognose, predict recurrence, or assess treatment outcomes using human datasets. Study quality and risk of bias were evaluated using the QUADAS-2 and QUIPS. Results: The 29 included studies were mostly retrospective, with sample sizes ranging from 10 to 63,000 patients. Most focused on LSCC, with a higher prevalence in males. The studies utilized various AI techniques, including deep learning models such as convolutional neural networks (CNNs) and DeepSurv, as well as ML algorithms like random survival forest, gradient boosting machines, random forest, k-nearest neighbors, naïve Bayes, and decision trees. AI models demonstrated strong prognostic performance, surpassing Cox regression and TNM staging in predicting survival and recurrence. Several studies reported outcomes related to treatment, such as chemotherapy response, occult lymph node metastasis, and the need for salvage surgery. 
Methodological quality varied, with biases related to patient selection and confounding factors. Conclusions: AI has the potential to improve prognosis estimation, recurrence prediction, and treatment outcome assessment in LC. However, although AI can be a helpful addition to clinical decision-making, more prospective studies, external validation, and standardized evaluation are necessary before these technologies can be confidently adopted in everyday clinical practice. Full article
(This article belongs to the Topic Machine Learning and Deep Learning in Medical Imaging)
27 pages, 1201 KB  
Review
Brain–Computer Interfaces in Learning Disorders and Mathematical Learning: A Scoping Review with Structured Narrative Synthesis
by Viktoriya Galitskaya, Georgios Polydoros, Alexandros-Stamatios Antoniou, Pantelis Pergantis and Athanasios Drigas
Appl. Sci. 2026, 16(8), 3846; https://doi.org/10.3390/app16083846 - 15 Apr 2026
Abstract
Brain–Computer Interfaces (BCIs) have increasingly been explored as tools for monitoring and modulating cognitive processes relevant to learning. However, their application to learning disorders, and especially to mathematical learning difficulties such as dyscalculia and ageometria, remains conceptually promising but empirically underdeveloped. The present study offers a scoping review with structured narrative synthesis of recent empirical research on BCI-based interventions in learning disorder populations, with particular attention paid to their possible translational relevance for mathematical learning. Following PRISMA-ScR principles and a Population–Concept–Context framework, studies published between 2020 and 2025 were identified through database searches in Scopus, IEEE Xplore, and PubMed. A total of 30 studies met the inclusion criteria. All eligible studies focused on Attention-Deficit/Hyperactivity Disorder (ADHD), while no eligible BCI intervention studies were found for dyscalculia or ageometria. The reviewed literature was dominated by EEG-based neurofeedback interventions. To move beyond descriptive summary, the included studies were organized using a structured analytical framework based on intervention modality, primary cognitive target, methodological robustness, and translational proximity to mathematical learning disorders. Across the evidence base, the most consistent findings concerned attention regulation and executive function outcomes, whereas academic and mathematics-related outcomes were sparse and methodologically less developed. Although several studies suggested improvements in domain-general cognitive mechanisms relevant to mathematical learning, the absence of direct evidence in dyscalculia and ageometria prevents confirmatory conclusions. 
The review therefore identifies both the promise and the limits of current BCI applications in learning disorder contexts and argues that future research should prioritize theory-driven, disorder-specific trials targeting numeracy, visuospatial reasoning, and executive processes in mathematical learning disabilities. Although current findings suggest promising cognitive and educational potential, these technologies are not yet ready for routine implementation in standard classroom environments without further validation, teacher training, ethical safeguards, and cost-effective deployment models. Full article
52 pages, 3234 KB  
Perspective
Edge-Intelligent and Cyber-Resilient Coordination of Electric Vehicles and Distributed Energy Resources in Modern Distribution Grids
by Mahmoud Ghofrani
Energies 2026, 19(8), 1867; https://doi.org/10.3390/en19081867 - 10 Apr 2026
Abstract
The rapid electrification of transportation and proliferation of distributed energy resources (DERs) are transforming distribution grids into highly dynamic, data-intensive, and cyber-physical systems. While reinforcement learning (RL), multi-agent coordination, and edge computing offer powerful tools for adaptive control, their deployment in safety-critical utility environments raises concerns regarding stability, certification compatibility, cyber-resilience, and regulatory acceptance. This paper presents an architecture-centric framework for edge-intelligent and cyber-resilient coordination of electric vehicles (EVs) and DERs that reconciles adaptive learning with deterministic safety guarantees. The proposed hierarchical edge–cloud architecture integrates multi-agent system (MAS) coordination, constraint-invariant reinforcement learning, and embedded cybersecurity mechanisms within a structured control hierarchy. Learning-enabled edge agents operate exclusively within standards-compliant safety envelopes enforced through supervisory constraint projection, control barrier functions, and Lyapunov-consistent stability safeguards. Protection-critical functions remain deterministic and isolated from adaptive layers, preserving compatibility with IEEE 1547 and existing utility protection schemes. The framework further incorporates anomaly triggered policy freezing, fail-safe fallback modes, and communication-aware resilience mechanisms to prevent unsafe transient behavior in non-stationary, distributed environments. Unlike simulation-only learning approaches, the architecture embeds progressive validation through software-in-the-loop (SIL), hardware-in-the-loop (HIL), and power hardware-in-the-loop (PHIL) testing to empirically verify transient stability, constraint compliance, and cyber-resilience under realistic timing and disturbance conditions. 
Beyond technical performance, the paper situates edge intelligence within standards evolution, governance structures, workforce transformation, techno-economic assessment, and equitable deployment pathways. By framing adaptive control as a bounded, auditable augmentation layer rather than a disruptive replacement for certified infrastructure, the proposed architecture provides a pragmatic roadmap for evolutionary modernization of distribution systems. Full article
(This article belongs to the Section E: Electric Vehicles)
25 pages, 854 KB  
Systematic Review
Hybrid Machine Learning Architectures for Emergency Triage: A Systematic Review of Predictive Performance and the Complexity Gradient
by Junaid Ullah, R. Kanesaraj Ramasamay and Venushini Rajendran
BioMedInformatics 2026, 6(2), 21; https://doi.org/10.3390/biomedinformatics6020021 - 10 Apr 2026
Abstract
Background: Emergency triage systems using machine learning traditionally rely on structured tabular data (vital signs), creating a “contextual blind spot” that ignores diagnostic information embedded in unstructured clinical narratives. Hybrid AI models that fuse tabular and text data may improve predictive discrimination, but the magnitude and conditions under which fusion adds value remain unclear. Methods: Five databases (PubMed, Scopus, Web of Science, IEEE Xplore, ACM Digital Library) were searched from 1 January 2015 to 15 December 2025. Eligible studies employed Hybrid AI models integrating structured and unstructured emergency department data with quantitative baseline comparisons. Twenty-five studies (N ≈ 4.8 million encounters) met inclusion criteria. We extracted marginal performance gains (ΔAUC), calibration metrics, and demographic reporting. Synthesis followed SWiM principles with subgroup meta-regression testing our novel “Complexity Gradient” hypothesis. Results: Hybrid models demonstrated superior discrimination compared to tabular baselines, with effect magnitude dependent on clinical task complexity. Low-complexity tasks (tachycardia prediction) showed minimal gains (median ΔAUC + 0.036, IQR: 0.02–0.05), while high-complexity tasks (hypoxia, sepsis) demonstrated substantial improvement (median ΔAUC + 0.111, IQR: 0.09–0.13). Meta-regression confirmed complexity significantly moderated effect size (R2 = 0.42, p = 0.003). Only 12% (3/25) of studies reported calibration metrics (Brier scores: 0.089–0.142). Zero studies stratified performance by race/ethnicity; 88% (22/25) failed to report training data demographics. Discussion: The complexity gradient framework explains when multimodal fusion adds predictive value: tasks where diagnostic signal resides in narrative features (temporality, negation) rather than physiological measurements. However, systematic absence of calibration reporting and fairness auditing prevents clinical deployment. 
Seventy-two percent of studies had high risk of bias in the analysis domain due to retrospective designs without temporal validation. Conclusions: Hybrid triage models show promise for complex diagnostic tasks but require mandatory calibration reporting and demographic performance stratification before clinical implementation. We propose minimum reporting standards including Brier scores, race-stratified metrics, and temporal validation protocols. Full article
41 pages, 84120 KB  
Article
DDS-over-TSN Framework for Time-Critical Applications in Industrial Metaverses
by Taemin Nam, Seongjin Yun and Won-Tae Kim
Appl. Sci. 2026, 16(8), 3641; https://doi.org/10.3390/app16083641 - 8 Apr 2026
Abstract
The industrial metaverse is a digital twin space that integrates the real world with virtual environments through bidirectional synchronization. It supports critical services, such as time-sensitive machine control and large-scale collaboration, which require Time-Sensitive Networking and scalable Data Distribution Services. DDS, developed by the Object Management Group, provides excellent scalability and diverse QoS policies but struggles to guarantee transmission delay and jitter for time-critical applications. TSN, based on IEEE 802.1 standards, addresses these challenges by ensuring time-criticality. However, current research lacks comprehensive integration mechanisms for DDS and TSN, particularly from the viewpoints of semantics and system framework. Additionally, there is no adaptive QoS mapping converting the abstract DDS QoS policies to the sophisticated TSN QoS parameters. This paper presents a novel DDS-over-TSN framework that incorporates three key functions to address these challenges. First, Cross-layer QoS Mapping automates correspondences between DDS and TSN parameters, deriving technical constraints from standard documentation through retrieval-augmented generation. Second, Semantic Priority Estimation extracts substantial priority levels by utilizing language model embedding vectors as high-dimensional feature extractors. Third, Adaptive Resource Allocation performs dynamic bandwidth distribution for each priority level through reinforcement learning. Simulation results reveal over 99% mapping accuracy and 97% consistency in priority extraction. The applied Deep Reinforcement Learning paradigm allocated 99% of required resources to high-priority classes and reduced resource wastage by 15% compared to conventional methods. This methodology meets industrial requirements by ensuring both deterministic real-time performance and efficient resource isolation. Full article
(This article belongs to the Special Issue Digital Twin and IoT, 2nd Edition)
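The cross-layer QoS mapping this abstract describes reduces, at its simplest, to bucketing abstract DDS QoS policies (DEADLINE period, TRANSPORT_PRIORITY) into the eight IEEE 802.1Q priority code points (PCP 0–7) that select TSN traffic classes. The thresholds below are invented for illustration; the paper derives its mapping automatically via retrieval-augmented generation rather than using a fixed table like this:

```python
def map_dds_to_pcp(deadline_ms, transport_priority):
    """Illustrative cross-layer QoS mapping sketch (thresholds are
    assumptions, not the paper's learned mapping): tighter DDS DEADLINE
    periods and a positive TRANSPORT_PRIORITY land in higher IEEE
    802.1Q priority code points (PCP 0-7), which select TSN classes."""
    if deadline_ms <= 1:
        base = 6       # time-critical machine control
    elif deadline_ms <= 10:
        base = 4       # interactive synchronization traffic
    else:
        base = 2       # background telemetry
    return min(7, base + (1 if transport_priority > 0 else 0))

print(map_dds_to_pcp(0.5, 10))   # 7: hard real-time, elevated priority
print(map_dds_to_pcp(100, 0))    # 2: background telemetry
```

The hard part the paper addresses is that DDS QoS values are unbounded and application-defined, so a static table like this degrades as topics are added; hence the learned, adaptive mapping.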
36 pages, 3666 KB  
Article
StegoPadding: A Steganographic Channel with QoS Support and Encryption for Smart Grids Based on Wi-Fi Networks
by Paweł Rydz and Marek Natkaniec
Electronics 2026, 15(7), 1504; https://doi.org/10.3390/electronics15071504 - 3 Apr 2026
Abstract
Wi-Fi networks used in smart grids are essential for enabling communication between smart meters and data aggregation units. A key challenge, however, is the ability to hide the existence and traffic patterns of these communications, so that sensitive information exchanges cannot be easily detected or intercepted. Unfortunately, most existing solutions do not provide support for traffic prioritization and steganographic channel encryption. In this paper, we propose a novel covert channel with Quality of Service (QoS) and encryption support for smart grid environments based on the IEEE 802.11 standard. We introduce an original steganographic approach that leverages the backoff mechanism, the Enhanced Distributed Channel Access (EDCA) function, frame aggregation, and the StegoPaddingCipher algorithm. This design ensures QoS-aware traffic handling while enhancing security through encryption of the transmitted covert data. The proposed protocol was implemented and evaluated using the ns-3 simulator, where it achieved excellent performance results. The system maintained high efficiency even under heavily saturated network conditions with additional background traffic generated by other nodes. The proposed covert channel offers an innovative and secure method for transmitting substantial volumes of QoS-related data within smart grid environments. Full article
(This article belongs to the Special Issue Communication Technologies for Smart Grid Application)
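The general idea of a backoff-based covert channel can be sketched in a few lines: the sender chooses each IEEE 802.11 backoff counter so that some observable property of it (here, parity) carries one hidden bit, while the value still falls inside the legal contention window. This illustrates only the backoff-channel principle; the paper's StegoPadding scheme additionally uses EDCA queues, frame aggregation, and the StegoPaddingCipher encryption, none of which are modeled here:

```python
import random

def embed_bits(bits, cw=15):
    """Covert-channel sketch: draw a random backoff slot in [0, cw],
    then nudge it so its parity encodes one hidden bit
    (even slot -> 0, odd slot -> 1). Assumes cw + 1 is even so the
    wrap-around nudge always flips parity."""
    backoffs = []
    for bit in bits:
        slot = random.randrange(0, cw + 1)
        if slot % 2 != bit:
            slot = (slot + 1) % (cw + 1)   # flip parity, stay in window
        backoffs.append(slot)
    return backoffs

def extract_bits(backoffs):
    """Receiver side: recover the hidden bits from observed slot parity."""
    return [slot % 2 for slot in backoffs]

covert = embed_bits([1, 0, 1, 1, 0])
print(extract_bits(covert))   # [1, 0, 1, 1, 0]
```

Keeping the slot uniformly random apart from the parity constraint is what makes the channel statistically hard to detect, since the backoff distribution remains close to the standard one.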
21 pages, 1356 KB  
Review
Biomimetic Strategies in Orthosis Design: A Scoping Review of Biological Abstraction and Functional Translation
by Tim Tchervonenko, Alexander Sauer, Thabata Alcântara Ferreira Ganga, Heike Beismann, Eduardo Keller Rorato, Míriam Raquel Diniz Zanetti and Maria Elizete Kunkel
Biomimetics 2026, 11(4), 241; https://doi.org/10.3390/biomimetics11040241 - 3 Apr 2026
Abstract
Orthoses are widely used to support or modulate neuromuscular and skeletal function; however, their clinical effectiveness is often limited by discomfort, poor adaptability, and suboptimal human–device interaction. Biomimetics has emerged as a structured design paradigm capable of enhancing orthotic performance by systematically translating biological principles into engineering solutions. This scoping review examined biomimetic strategies in the development of orthoses. A structured search was conducted across PubMed, IEEE Xplore, Web of Science, and Scopus (2000–2025). Of 453 identified records, 14 met the inclusion criteria. Biomimetic orthosis research emerged predominantly after 2012, with increased activity after 2021. Human-based biological models, particularly muscle–tendon systems, predominated. Most studies relied on functional abstraction and were implemented using cable-driven or electromechanical actuation. None of the included studies explicitly referenced established biomimetics standards (e.g., ISO 18458), and descriptions of biological analysis, abstraction, and transfer were frequently incomplete. Experimental validation was generally limited to prototype-level testing, small sample sizes, and short-term evaluations, with no longitudinal or multicenter studies identified. These findings reveal a structural imbalance between conceptual biomimetic inspiration and structured methodological implementation. Based on this analysis, a structured biomimetic workflow is proposed to enhance traceability, reporting clarity, and clinical translation in the development of orthosis. Full article
32 pages, 8873 KB  
Article
Experimental Verification of IEEE, CIGRÉ and IEC Thermal Models for Dynamic Line Rating of ACSR Overhead Lines
by Miloš Milovanović, Andrijana Jovanović, Mladen Banjanin, Ilija Vukašinović, Branko Gvozdić, Aleksandar Žorić, Bojan Perović and Jovan Vukašinović
Electricity 2026, 7(2), 34; https://doi.org/10.3390/electricity7020034 - 2 Apr 2026
Abstract
This paper presents an experimental investigation of dynamic line rating (DLR) applied to aluminium conductor steel-reinforced (ACSR) overhead line conductors, with a specific focus on wind speed conditions up to 5 m/s. An experimental system was designed and implemented to provide controlled and repeatable cross-flow air conditions along a tested ACSR conductor, enabling direct measurement of wind speed in the immediate vicinity of the conductor surface. Conductor temperature, electrical current, voltage drop per unit length, the phase angle between them, and relevant meteorological parameters were continuously measured under controlled experimental conditions. Based on the measured data, the conductor heat balance was evaluated and the allowable current-carrying capacity was determined. The experimentally obtained conductor temperatures and ampacity values were compared with results calculated using thermal models and correlations recommended by IEEE, CIGRÉ, and IEC standards. The comparison demonstrates that, under low and moderate wind speed conditions, differences between standard-based predictions and experimental results can be significant, leading to deviations in the estimation of allowable current-carrying capacity. The results confirm the high sensitivity of DLR calculations to wind-related assumptions and provide an experimentally validated basis for assessing the applicability and limitations of existing standard thermal models for ACSR conductors under realistic operating conditions. Full article
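The ampacity calculation described above rests on the conductor's steady-state heat balance, which the IEEE, CIGRÉ, and IEC thermal models share in common form: Joule heating plus absorbed solar heating must equal convective plus radiative cooling. A minimal sketch of solving that balance for the allowable current is shown below; the function name and all numerical values are illustrative assumptions, not data from the paper, and the heat-gain/loss terms would in practice come from the standards' correlations at the measured wind speed and conductor temperature.

```python
import math

def ampacity(q_convective, q_radiative, q_solar, r_ac):
    """Steady-state ampacity from the conductor heat balance:
        I^2 * R(Tc) + q_s = q_c + q_r  =>  I = sqrt((q_c + q_r - q_s) / R(Tc))
    Heat terms are per unit length (W/m); r_ac is the AC resistance
    at conductor temperature Tc (ohm/m)."""
    return math.sqrt((q_convective + q_radiative - q_solar) / r_ac)

# Illustrative values only: q_c = 30 W/m, q_r = 10 W/m, q_s = 8 W/m,
# R(Tc) = 8e-5 ohm/m  ->  I = sqrt(32 / 8e-5) ≈ 632.5 A
print(round(ampacity(30.0, 10.0, 8.0, 8e-5), 1))  # 632.5
```

The sensitivity to wind noted in the abstract enters entirely through the convective term: at low wind speeds the standards' convection correlations diverge most, which is why small changes in `q_convective` shift the computed ampacity disproportionately.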
28 pages, 3304 KB  
Article
A Two-Stage Stochastic Programming Approach to Unit Commitment with Wind Power Integration: A Novel Pricing Scheme
by Jiaxu Huang, Jie Tao and Dingfang Su
Sustainability 2026, 18(7), 3479; https://doi.org/10.3390/su18073479 - 2 Apr 2026
Abstract
With high wind power penetration, power system operations face significant uncertainty, rendering traditional pricing mechanisms inadequate for stochastic dispatch environments and hindering the sustainable development of power systems with high renewable energy integration. This paper systematically compares three electricity pricing schemes—system marginal pricing, conservative pricing, and the proposed average pricing—within a two-stage stochastic unit commitment framework. It is found that system marginal pricing behaves as an ex post pricing method dependent on scenario realizations and lacks stability, whereas conservative pricing degenerates into a scheme based on the minimum wind output scenario, leading to higher and more volatile prices. To address these issues, this paper proposes a novel “Average Pricing” method, in which the day-ahead price is defined as the expected value of marginal prices across all wind power scenarios. Theoretical analysis and numerical simulations on the IEEE 39-bus system demonstrate that the proposed method offers both economic interpretability and numerical stability, with mean prices ranging from 14.0739 to 15.9825 and standard deviations ranging from 16.6323 to 19.9471 across four seasonal cases. Compared with conservative pricing, it achieves lower mean prices and lower price volatility in three of the four seasons while maintaining a unique day-ahead price and providing a novel and sustainable pathway for pricing design in power systems with high renewable energy integration. Full article
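The proposed "Average Pricing" rule reduces to a probability-weighted expectation over scenario marginal prices. A minimal sketch is below; the function name, the equiprobable-scenario default, and the example prices are illustrative assumptions, not figures from the paper, where the marginal prices would be the duals of the second-stage dispatch problems.

```python
def average_price(scenario_prices, probabilities=None):
    """Day-ahead 'average price': the expected value of the marginal
    prices across wind power scenarios. If no probabilities are given,
    scenarios are treated as equiprobable."""
    if probabilities is None:
        probabilities = [1.0 / len(scenario_prices)] * len(scenario_prices)
    return sum(pi * price for price, pi in zip(scenario_prices, probabilities))

# Illustrative marginal prices from four wind scenarios:
prices = [12.0, 14.0, 16.0, 18.0]
print(average_price(prices))                      # 15.0
print(average_price([10.0, 20.0], [0.25, 0.75]))  # 17.5
```

Because the expectation is fixed once the scenario set and probabilities are fixed, the resulting day-ahead price is unique and does not depend on which scenario is later realized, which is the stability property the abstract contrasts with ex post system marginal pricing.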
28 pages, 495 KB  
Review
Securing the Cognitive Layer: A Survey on Security Threats, Defenses, and Privacy-Preserving Architectures for LLM-IoT Integration
by Ayan Joshi and Sabur Baidya
J. Cybersecur. Priv. 2026, 6(2), 63; https://doi.org/10.3390/jcp6020063 - 2 Apr 2026
Abstract
The convergence of Large Language Models (LLMs) and Internet of Things (IoT) systems has created a new class of intelligent applications across healthcare, industrial automation, smart cities, and connected homes. However, this integration introduces a complex and largely underexplored security landscape. LLMs deployed in IoT contexts face threats spanning both the AI and embedded systems domains, including prompt injection through sensor-driven inputs, model extraction from edge devices, data poisoning of IoT data streams, and privacy leakage through LLM-generated responses grounded in personal data. Simultaneously, LLMs are proving to be powerful tools for IoT security, with LLM-based intrusion detection systems achieving 95–99% accuracy on standard IoT datasets and LLM-driven threat intelligence outperforming traditional machine learning by significant margins. We systematically review 88 papers from IEEE, ACM, MDPI, and arXiv (2020–2025), providing: (1) a structured taxonomy of security threats targeting LLM-IoT systems, (2) a review of LLMs as security enablers for IoT, (3) an evaluation of privacy-preserving architectures including federated learning, differential privacy, homomorphic encryption, and trusted execution environments, (4) domain-specific security analysis across healthcare, industrial, smart home, smart grid, and vehicular IoT, and (5) a literature-based comparative analysis of LLM-based security systems. A central finding is the accuracy–efficiency–privacy trilemma: the model compression techniques needed to deploy LLMs on resource-constrained IoT devices can degrade security and even introduce new vulnerabilities. Our analysis provides researchers and practitioners with a structured understanding of both the risks and opportunities at the frontier of LLM-IoT security. Full article