Search Results (121)

Search Parameters:
Keywords = cyber resilience challenges

27 pages, 8383 KiB  
Article
A Resilience Quantitative Assessment Framework for Cyber–Physical Systems: Mathematical Modeling and Simulation
by Zhigang Cao, Hantao Zhao, Yunfan Wang, Chuan He, Ding Zhou and Xiaopeng Han
Appl. Sci. 2025, 15(15), 8285; https://doi.org/10.3390/app15158285 - 25 Jul 2025
Viewed by 108
Abstract
As cyber threats continue to grow in complexity and persistence, resilience has become a critical requirement for cyber–physical systems (CPSs). Resilience quantitative assessment is essential for supporting secure system design and ensuring reliable operation. Although various methods have been proposed for evaluating CPS resilience, major challenges remain in accurately modeling the interaction between cyber and physical domains and in providing structured guidance for resilience-oriented design. This study proposes an integrated CPS resilience assessment framework that combines cyber-layer anomaly modeling based on Markov chains with mathematical modeling of performance degradation and recovery in the physical domain. The framework establishes a structured evaluation process through parameter normalization and cyber–physical coupling, enabling the generation of resilience curves that clearly represent system performance changes under adverse conditions. A case study involving an industrial controller equipped with a diversity-redundancy architecture is conducted to demonstrate the applicability of the proposed method. Modeling and simulation results indicate that the framework effectively reveals key resilience characteristics and supports performance-informed design optimization.
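
A rough sense of how a Markov-chain cyber layer can drive a resilience curve is given by the Python sketch below. It illustrates the general idea only, not the framework from this article: the three states, the transition matrix, and the per-state performance levels are assumed values, and the resilience index is simply the mean of the normalized performance curve.

```python
# Illustrative sketch only: a toy Markov-chain resilience curve, not the
# framework from the paper. States, transition probabilities, and
# performance levels are assumed values chosen for demonstration.
import numpy as np

STATES = ["normal", "degraded", "compromised"]
# Rows: current state, columns: next state (assumed transition matrix).
P = np.array([
    [0.95, 0.04, 0.01],   # normal
    [0.30, 0.60, 0.10],   # degraded (recovery back to normal is possible)
    [0.20, 0.30, 0.50],   # compromised
])
PERFORMANCE = {"normal": 1.0, "degraded": 0.6, "compromised": 0.2}

def simulate_performance(steps: int = 200, seed: int = 0) -> np.ndarray:
    """Simulate one trajectory of system performance over time."""
    rng = np.random.default_rng(seed)
    state = 0  # start in the 'normal' state
    perf = np.empty(steps)
    for t in range(steps):
        perf[t] = PERFORMANCE[STATES[state]]
        state = rng.choice(len(STATES), p=P[state])
    return perf

def resilience_index(perf: np.ndarray) -> float:
    """Area under the normalized performance curve (1.0 = no degradation)."""
    return float(perf.mean())

if __name__ == "__main__":
    curves = np.array([simulate_performance(seed=s) for s in range(100)])
    print(f"mean resilience index over 100 runs: {resilience_index(curves):.3f}")
```

A full assessment along the lines the abstract describes would estimate the transition probabilities from anomaly data and couple the cyber state to a physical-domain degradation-and-recovery model.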

19 pages, 1040 KiB  
Systematic Review
A Systematic Review on Risk Management and Enhancing Reliability in Autonomous Vehicles
by Ali Mahmood and Róbert Szabolcsi
Machines 2025, 13(8), 646; https://doi.org/10.3390/machines13080646 - 24 Jul 2025
Viewed by 297
Abstract
Autonomous vehicles (AVs) hold the potential to revolutionize transportation by improving safety, operational efficiency, and environmental impact. However, ensuring reliability and safety in real-world conditions remains a major challenge. Based on an in-depth examination of 33 peer-reviewed studies (2015–2025), this systematic review organizes advancements across five key domains: fault detection and diagnosis (FDD), collision avoidance and decision making, system reliability and resilience, validation and verification (V&V), and safety evaluation. It integrates both hardware- and software-level perspectives, with a focus on emerging techniques such as Bayesian behavior prediction, uncertainty-aware control, and set-based fault detection to enhance operational robustness. Despite these advances, this review identifies persistent challenges, including limited cross-layer fault modeling, lack of formal verification for learning-based components, and the scarcity of scenario-driven validation datasets. To address these gaps, this paper proposes future directions such as verifiable machine learning, unified fault propagation models, digital twin-based reliability frameworks, and cyber-physical threat modeling. This review offers a comprehensive reference for developing certifiable, context-aware, and fail-operational autonomous driving systems, contributing to the broader goal of ensuring safe and trustworthy AV deployment.

35 pages, 3265 KiB  
Article
Cyber Edge: Current State of Cybersecurity in Aotearoa-New Zealand, Opportunities, and Challenges
by Md. Rajib Hasan, Nurul I. Sarkar, Noor H. S. Alani and Raymond Lutui
Electronics 2025, 14(14), 2915; https://doi.org/10.3390/electronics14142915 - 21 Jul 2025
Viewed by 355
Abstract
This study investigates the cybersecurity landscape of Aotearoa-New Zealand through a culturally grounded lens, focusing on the integration of Indigenous Māori values into cybersecurity frameworks. In response to escalating cyber threats, the research adopts a mixed-methods and interdisciplinary approach—combining surveys, focus groups, and case studies—to explore how cultural principles such as whanaungatanga (collective responsibility) and manaakitanga (care and respect) influence digital safety practices. The findings demonstrate that culturally informed strategies enhance trust, resilience, and community engagement, particularly in rural and underserved Māori communities. Quantitative analysis revealed that 63% of urban participants correctly identified phishing attempts compared to 38% of rural participants, highlighting a significant urban–rural awareness gap. Additionally, over 72% of Māori respondents indicated that cybersecurity messaging was more effective when delivered through familiar cultural channels, such as marae networks or iwi-led training programmes. Focus groups reinforced this, with participants noting stronger retention and behavioural change when cyber risks were communicated using Māori metaphors, language, or values-based analogies. The study also confirms that culturally grounded interventions—such as incorporating Māori motifs (e.g., koru, poutama) into secure interface design and using iwi structures to disseminate best practices—can align with international standards like NIST CSF and ISO 27001. This compatibility enhances stakeholder buy-in and demonstrates universal applicability in multicultural contexts. Key challenges identified include a cybersecurity talent shortage in remote areas, difficulties integrating Indigenous perspectives into mainstream policy, and persistent barriers from the digital divide. The research advocates for cross-sector collaboration among government, private industry, and Indigenous communities to co-develop inclusive, resilient cybersecurity ecosystems. Based on the UTAUT and New Zealand’s cybersecurity vision “Secure Together—Tō Tātou Korowai Manaaki 2023–2028,” this study provides a model for small nations and multicultural societies to create robust, inclusive cybersecurity frameworks.
(This article belongs to the Special Issue Intelligent Solutions for Network and Cyber Security)

25 pages, 4186 KiB  
Review
Total Productive Maintenance and Industry 4.0: A Literature-Based Path Toward a Proposed Standardized Framework
by Zineb Mouhib, Maryam Gallab, Safae Merzouk, Aziz Soulhi and Mario Di Nardo
Appl. Syst. Innov. 2025, 8(4), 98; https://doi.org/10.3390/asi8040098 - 21 Jul 2025
Viewed by 515
Abstract
In the context of Industry 4.0, Total Productive Maintenance (TPM) is undergoing a major shift driven by digital technologies such as the IoT, AI, cloud computing, and Cyber–Physical systems. This study explores how these technologies reshape traditional TPM pillars and practices through a two-phase methodology: bibliometric analysis, which reveals global research trends, key contributors, and emerging themes, and a systematic review, which discusses how core TPM practices are being transformed by advanced technologies. It also identifies key challenges of this transition, including data aggregation, a lack of skills, and resistance. However, despite the growing body of research on digital TPM, a major gap persists: the lack of a standardized model applicable across industries. Existing approaches are often fragmented or too context-specific, limiting scalability. Addressing this gap requires a structured approach that aligns technological advancements with TPM’s foundational principles. Taking a cue from these findings, this article formulates a systematic and scalable framework for TPM 4.0 deployment. The framework is based on four pillars: modular technological architecture, phased deployment, workforce integration, and standardized performance indicators. The ultimate goal is to provide a basis for a universal digital TPM standard that enhances the efficiency, resilience, and efficacy of smart maintenance systems.
(This article belongs to the Section Industrial and Manufacturing Engineering)

42 pages, 2129 KiB  
Review
Ensemble Learning Approaches for Multi-Class Intrusion Detection Systems for the Internet of Vehicles (IoV): A Comprehensive Survey
by Manal Alharthi, Faiza Medjek and Djamel Djenouri
Future Internet 2025, 17(7), 317; https://doi.org/10.3390/fi17070317 - 19 Jul 2025
Viewed by 385
Abstract
The emergence of the Internet of Vehicles (IoV) has revolutionized intelligent transportation and communication systems. However, IoV presents many complex and ever-changing security challenges and thus requires robust cybersecurity protocols. This paper comprehensively describes and evaluates ensemble learning approaches for multi-class intrusion detection systems in the IoV environment. The study covers several approaches, such as stacking, voting, boosting, and bagging. A comprehensive review of the literature spanning 2020 to 2025 reveals important trends and topics that require further investigation and the relative merits of different ensemble approaches. The NSL-KDD, CICIDS2017, and UNSW-NB15 datasets are widely used to evaluate the performance of Ensemble Learning-Based Intrusion Detection Systems (ELIDS). ELIDS evaluation is usually carried out using popular performance metrics, including Precision, Accuracy, Recall, F1-score, and the Area Under the Receiver Operating Characteristic Curve (AUC-ROC), which measure the effectiveness of the different ensemble learning methods. Given the increasing complexity and frequency of cyber threats in IoV environments, ensemble learning methods such as bagging, boosting, and stacking enhance adaptability and robustness. These methods aggregate multiple learners to improve detection rates, reduce false positives, and ensure more resilient intrusion detection models that can evolve alongside emerging attack patterns.
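
For readers who want to see the surveyed ensemble strategies and metrics side by side, the short scikit-learn sketch below compares bagging, boosting, and soft voting on synthetic multi-class data. It is a stand-in for, not a reproduction of, an evaluation on NSL-KDD, CICIDS2017, or UNSW-NB15; all dataset parameters are placeholders.

```python
# Minimal multi-class ensemble comparison on synthetic data; a stand-in for
# the NSL-KDD/CICIDS2017/UNSW-NB15 evaluations discussed in the survey.
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split

# Synthetic "traffic" with 4 attack classes (placeholder for real IDS data).
X, y = make_classification(n_samples=5000, n_features=30, n_informative=15,
                           n_classes=4, n_clusters_per_class=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                          stratify=y)

models = {
    "bagging": BaggingClassifier(n_estimators=50, random_state=0),
    "boosting": GradientBoostingClassifier(random_state=0),
    "voting": VotingClassifier(
        estimators=[("rf", RandomForestClassifier(random_state=0)),
                    ("gb", GradientBoostingClassifier(random_state=0)),
                    ("lr", LogisticRegression(max_iter=1000))],
        voting="soft"),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    proba = model.predict_proba(X_te)
    print(f"{name:8s} acc={accuracy_score(y_te, pred):.3f} "
          f"prec={precision_score(y_te, pred, average='macro'):.3f} "
          f"rec={recall_score(y_te, pred, average='macro'):.3f} "
          f"f1={f1_score(y_te, pred, average='macro'):.3f} "
          f"auc={roc_auc_score(y_te, proba, multi_class='ovr'):.3f}")
```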

12 pages, 231 KiB  
Systematic Review
Cybersecurity Issues in Electrical Protection Relays: A Systematic Review
by Giovanni Battista Gaggero, Paola Girdinio and Mario Marchese
Energies 2025, 18(14), 3796; https://doi.org/10.3390/en18143796 - 17 Jul 2025
Viewed by 235
Abstract
The increasing digitalization of power systems has revolutionized the functionality and efficiency of electrical protection relays. These digital relays enhance fault detection, monitoring, and response mechanisms, ensuring the reliability and stability of power networks. However, their connectivity and reliance on communication protocols introduce significant cybersecurity risks, making them potential targets for malicious attacks. Cyber threats against digital protection relays can lead to severe consequences, including cascading failures, equipment damage, and compromised grid security. This paper presents a comprehensive review of cybersecurity challenges in digital electrical protection relays, focusing on four key areas: (1) a taxonomy of cyber attack models targeting protection relays, (2) the associated risks and their potential impact on power systems, (3) existing mitigation strategies to enhance relay security, and (4) future research directions to strengthen resilience against cyber threats.

27 pages, 2260 KiB  
Article
Machine Learning for Industrial Optimization and Predictive Control: A Patent-Based Perspective with a Focus on Taiwan’s High-Tech Manufacturing
by Chien-Chih Wang and Chun-Hua Chien
Processes 2025, 13(7), 2256; https://doi.org/10.3390/pr13072256 - 15 Jul 2025
Viewed by 686
Abstract
The global trend toward Industry 4.0 has intensified the demand for intelligent, adaptive, and energy-efficient manufacturing systems. Machine learning (ML) has emerged as a crucial enabler of this transformation, particularly in high-mix, high-precision environments. This review examines the integration of machine learning techniques, such as convolutional neural networks (CNNs), reinforcement learning (RL), and federated learning (FL), within Taiwan’s advanced manufacturing sectors, including semiconductor fabrication, smart assembly, and industrial energy optimization. The present study draws on patent data and industrial case studies from leading firms, such as TSMC, Foxconn, and Delta Electronics, to trace the evolution from classical optimization to hybrid, data-driven frameworks. A critical analysis of key challenges is provided, including data heterogeneity, limited model interpretability, and integration with legacy systems. A comprehensive framework is proposed to address these issues, incorporating data-centric learning, explainable artificial intelligence (XAI), and cyber–physical architectures. These components align with industrial standards, including the Reference Architecture Model Industrie 4.0 (RAMI 4.0) and the Industrial Internet Reference Architecture (IIRA). The paper concludes by outlining prospective research directions, with a focus on cross-factory learning, causal inference, and scalable industrial AI deployment. This work provides an in-depth examination of the potential of machine learning to transform manufacturing into a more transparent, resilient, and responsive ecosystem. Additionally, this review highlights Taiwan’s distinctive position in the global high-tech manufacturing landscape and provides an in-depth analysis of patent trends from 2015 to 2025. Notably, this study adopts a patent-centered perspective to capture practical innovation trends and technological maturity specific to Taiwan’s globally competitive high-tech sector.
(This article belongs to the Special Issue Machine Learning for Industrial Optimization and Predictive Control)

25 pages, 1799 KiB  
Systematic Review
Cyber-Physical Systems for Smart Farming: A Systematic Review
by Alexis Montalvo, Oscar Camacho and Danilo Chavez
Sustainability 2025, 17(14), 6393; https://doi.org/10.3390/su17146393 - 12 Jul 2025
Viewed by 409
Abstract
In recent decades, climate change, increasing demand, and resource scarcity have transformed the agricultural sector into a critical field of research. Farmers have been compelled to adopt innovations and new technologies to enhance production efficiency and crop resilience. This study presents a systematic literature review, supplemented by a bibliometric analysis of relevant documents, focusing on the key applications and combined techniques of artificial intelligence (AI), machine learning (ML), and digital twins (DT) in the development and implementation of cyber-physical systems (CPS) in smart agriculture, and establishes whether CPS in agriculture is an attractive research topic. A total of 108 bibliographic records from the Scopus and Google Scholar databases were analyzed to construct the bibliometric study database. The findings reveal that CPS has evolved and emerged as a promising research area, largely due to its versatility and integration potential. The analysis offers researchers and practitioners a comprehensive overview of the existing literature and research trends on the dynamic relationship between CPS and its primary applications in the agricultural industry while encouraging further exploration in this field. Additionally, the main challenges associated with implementing CPS in the context of smart agriculture are discussed, contributing to a deeper understanding of this topic.

24 pages, 7080 KiB  
Review
Responsible Resilience in Cyber–Physical–Social Systems: A New Paradigm for Emergent Cyber Risk Modeling
by Theresa Sobb, Nour Moustafa and Benjamin Turnbull
Future Internet 2025, 17(7), 282; https://doi.org/10.3390/fi17070282 - 25 Jun 2025
Cited by 1 | Viewed by 327
Abstract
As cyber systems increasingly converge with physical infrastructure and social processes, they give rise to Complex Cyber–Physical–Social Systems (C-CPSS), whose emergent behaviors pose unique risks to security and mission assurance. Traditional cyber–physical system models often fail to address the unpredictability arising from human and organizational dynamics, leaving critical gaps in how cyber risks are assessed and managed across interconnected domains. The challenge lies in building resilient systems that not only resist disruption, but also absorb, recover, and adapt—especially in the face of complex, nonlinear, and often unintentionally emergent threats. This paper introduces the concept of ‘responsible resilience’, defined as the capacity of systems to adapt to cyber risks using trustworthy, transparent agent-based models that operate within socio-technical contexts. We identify a fundamental research gap in the treatment of social complexity and emergence in the existing cyber–physical system literature. To address this, we propose the E3R modeling paradigm—a novel framework for conceptualizing Emergent, Risk-Relevant Resilience in C-CPSS. This paradigm synthesizes human-in-the-loop diagrams, agent-based Artificial Intelligence simulations, and ontology-driven representations to model the interdependencies and feedback loops driving unpredictable cyber risk propagation more effectively. Compared to conventional cyber–physical system models, E3R accounts for adaptive risks across social, cyber, and physical layers, enabling a more accurate and ethically grounded foundation for cyber defence and mission assurance. Our analysis of the literature reveals an underrepresentation of socio-emergent risk modeling, and our results indicate that existing models—especially those in industrial and healthcare applications of cyber–physical systems—lack the generalizability and robustness necessary for complex, cross-domain environments. The E3R framework thus marks a significant step forward in understanding and mitigating emergent threats in future digital ecosystems.
(This article belongs to the Special Issue Internet of Things and Cyber-Physical Systems, 3rd Edition)

36 pages, 1717 KiB  
Article
Generative Adversarial and Transformer Network Synergy for Robust Intrusion Detection in IoT Environments
by Pardis Sadatian Moghaddam, Ali Vaziri, Sarvenaz Sadat Khatami, Francisco Hernando-Gallego and Diego Martín
Future Internet 2025, 17(6), 258; https://doi.org/10.3390/fi17060258 - 12 Jun 2025
Viewed by 633
Abstract
Intrusion detection in Internet of Things (IoT) environments is increasingly critical due to the rapid proliferation of connected devices and the growing sophistication of cyber threats. Traditional detection methods often fall short in identifying multi-class attacks, particularly in the presence of high-dimensional and imbalanced IoT traffic. To address these challenges, this paper proposes a novel hybrid intrusion detection framework that integrates transformer networks with generative adversarial networks (GANs), aiming to enhance both detection accuracy and robustness. In the proposed architecture, the transformer component effectively models temporal and contextual dependencies within traffic sequences, while the GAN component generates synthetic data to improve feature diversity and mitigate class imbalance. Additionally, an improved non-dominated sorting biogeography-based optimization (INSBBO) algorithm is employed to fine-tune the hyper-parameters of the hybrid model, further enhancing learning stability and detection performance. The model is trained and evaluated on the CIC-IoT-2023 and TON_IoT datasets, which contain a diverse range of real-world IoT traffic and attack scenarios. Experimental results show that our hybrid framework consistently outperforms baseline methods in both binary and multi-class intrusion detection tasks. The transformer-GAN achieves a multi-class classification accuracy of 99.67%, with an F1-score of 99.61%, and an area under the curve (AUC) of 99.80% on the CIC-IoT-2023 dataset, and achieves 98.84% accuracy, 98.79% F1-score, and 99.12% AUC on the TON_IoT dataset. The superiority of the proposed model was further validated through statistically significant t-test results, lower execution time compared to baselines, and minimal standard deviation across runs, indicating both efficiency and stability. The proposed framework offers a promising approach for enhancing the security and resilience of next-generation IoT systems.
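
As a minimal illustration of the transformer branch of such a hybrid, the PyTorch sketch below classifies short windows of flow features with a transformer encoder. It deliberately omits the GAN-based augmentation and the INSBBO hyper-parameter tuning described in the abstract, and every dimension (window length, feature count, class count) is an assumed placeholder.

```python
# Minimal transformer-encoder intrusion classifier (PyTorch). This is an
# illustrative sketch of the transformer branch only; the GAN augmentation
# and INSBBO tuning from the paper are not reproduced here.
import torch
import torch.nn as nn

class FlowTransformer(nn.Module):
    def __init__(self, n_features: int = 40, seq_len: int = 10,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2,
                 n_classes: int = 8):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)        # per-step embedding
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) -- a short window of flow records
        h = self.encoder(self.proj(x) + self.pos)
        return self.head(h.mean(dim=1))                   # pooled class logits

if __name__ == "__main__":
    model = FlowTransformer()
    x = torch.randn(32, 10, 40)          # dummy batch of traffic windows
    logits = model(x)
    loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 8, (32,)))
    loss.backward()
    print(logits.shape, float(loss))
```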

33 pages, 648 KiB  
Review
Impact of EU Laws on AI Adoption in Smart Grids: A Review of Regulatory Barriers, Technological Challenges, and Stakeholder Benefits
by Bo Nørregaard Jørgensen, Saraswathy Shamini Gunasekaran and Zheng Grace Ma
Energies 2025, 18(12), 3002; https://doi.org/10.3390/en18123002 - 6 Jun 2025
Viewed by 895
Abstract
This scoping review examines the evolving landscape of European Union (EU) legislation, as it pertains to the implementation of artificial intelligence (AI) in smart grid systems. By outlining the current regulatory landscape, including the General Data Protection Regulation (GDPR), the EU Artificial Intelligence Act, the EU Data Act, the EU Data Governance Act, the ePrivacy framework, the Network and Information Systems (NIS2) Directive, the EU Cyber Resilience Act, the EU Network Code on Cybersecurity for the electricity sector, and the EU Cybersecurity Act, it highlights both constraints and opportunities for stakeholders, including energy utilities, technology providers, and end-users. The analysis delves into regulatory barriers such as data protection requirements, algorithmic transparency mandates, and liability concerns that can limit the scope and scale of AI deployment. Technological challenges are also addressed, ranging from the integration of distributed energy resources and real-time data processing to cybersecurity and standardization issues. Despite these challenges, this review emphasizes how compliance with EU laws may ultimately boost consumer trust, promote ethical AI usage, and streamline the roll-out of robust, scalable smart grid solutions. The paper further explores stakeholder benefits, including enhanced grid stability, cost reductions through automation, and improved sustainability targets aligned with the EU’s broader energy and climate strategies. By synthesizing these findings, the review offers insights into policy gaps, technological enablers, and collaborative frameworks critical for accelerating AI-driven innovation in the energy sector, helping stakeholders navigate a complex regulatory environment while reaping its potential rewards.

29 pages, 1626 KiB  
Article
Cybersecurity for Analyzing Artificial Intelligence (AI)-Based Assistive Technology and Systems in Digital Health
by Abdullah M. Algarni and Vijey Thayananthan
Systems 2025, 13(6), 439; https://doi.org/10.3390/systems13060439 - 5 Jun 2025
Viewed by 832
Abstract
Assistive technology (AT) is increasingly utilized across various sectors, including digital healthcare and sports education. E-learning plays a vital role in enabling students with special needs, particularly those in remote areas, to access education. However, as the adoption of AI-based AT systems expands, the associated cybersecurity challenges also grow. This study aims to examine the impact of AI-driven assistive technologies on cybersecurity in digital healthcare applications, with a focus on the potential vulnerabilities these technologies present. Methods: The proposed model focuses on enhancing AI-based AT through the implementation of emerging technologies used for security, risk management strategies, and a robust assessment framework. With these improvements, the AI-based Internet of Things (IoT) plays a major role within AT. This model addresses the identification and mitigation of cybersecurity risks in AI-based systems, specifically in the context of digital healthcare applications. Results: The findings indicate that the application of the AI-based risk and resilience assessment framework significantly improves the security of AT systems, specifically those supporting e-learning for blind users. The model demonstrated measurable improvements in the robustness of cybersecurity in digital health, particularly in reducing cyber risks for AT users involved in e-learning environments. Conclusions: The proposed model provides a comprehensive approach to securing AI-based AT in digital healthcare applications. By improving the resilience of assistive systems, it minimizes cybersecurity risks for users, specifically blind individuals, and enhances the effectiveness of e-learning in sports education.
(This article belongs to the Section Artificial Intelligence and Digital Systems Engineering)

20 pages, 2304 KiB  
Article
Memory-Driven Forensic Analysis of SQL Server: A Buffer Pool and Page Inspection Approach
by Jiho Shin
Sensors 2025, 25(11), 3512; https://doi.org/10.3390/s25113512 - 2 Jun 2025
Viewed by 678
Abstract
This study proposes a memory-based forensic procedure for real-time recovery of deleted data in Microsoft SQL Server environments. This approach is particularly relevant for sensor-driven and embedded systems—such as those used in IoT gateways and edge computing platforms—where lightweight SQL engines store critical operational and measurement data locally and are vulnerable to insider manipulation. Traditional approaches to deleted data recovery have primarily relied on transaction log analysis or static methods involving the examination of physical files such as .mdf and .ldf after taking the database offline. However, these methods face critical limitations in real-time applicability and may miss volatile data that temporarily resides in memory. To address these challenges, this study introduces a methodology that captures key deletion event information through transaction log analysis immediately after data deletion and directly inspects memory-resident pages loaded in the server’s Buffer Pool. By analyzing page structures in the Buffer Pool and cross-referencing them with log data, we establish a memory-driven forensic framework that enables both the recovery and verification of deleted records. In the experimental validation, records were deleted in a live SQL Server environment, and a combination of transaction log analysis and in-memory page inspection allowed for partial or full recovery of the deleted data. This demonstrates the feasibility of real-time forensic analysis without interrupting the operational database. The findings of this research provide a foundational methodology for enhancing the speed and accuracy of digital forensics in time-sensitive scenarios, such as insider threats or cyber intrusion incidents, by enabling prompt and precise recovery of deleted data directly from memory. These capabilities are especially critical in IoT environments, where real-time deletion recovery supports sensor data integrity, forensic traceability, and uninterrupted system resilience.
(This article belongs to the Special Issue Network Security and IoT Security: 2nd Edition)
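
A starting point for gathering the two kinds of evidence this procedure combines is sketched below in Python with pyodbc: it lists buffer-pool page descriptors for a table of interest (sys.dm_os_buffer_descriptors) and recent row-delete records from the transaction log (fn_dblog). This is a simplified illustration rather than the paper's tooling; the connection string and table name are placeholders, querying fn_dblog requires elevated privileges, and reconstructing the deleted rows from the referenced pages (e.g., with DBCC PAGE) needs additional parsing not shown here.

```python
# Illustrative evidence-gathering sketch (not the paper's tooling):
# list buffer-pool pages for a table and recent LOP_DELETE_ROWS log records.
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
            "DATABASE=ForensicDB;Trusted_Connection=yes;TrustServerCertificate=yes")
TABLE = "dbo.Evidence"   # hypothetical table of interest

BUFFER_PAGES_SQL = """
SELECT bd.file_id, bd.page_id, bd.page_type, bd.is_modified
FROM sys.dm_os_buffer_descriptors AS bd
JOIN sys.allocation_units AS au ON bd.allocation_unit_id = au.allocation_unit_id
JOIN sys.partitions AS p ON au.container_id = p.hobt_id
WHERE bd.database_id = DB_ID() AND p.object_id = OBJECT_ID(?)
"""

DELETE_LOG_SQL = """
SELECT [Current LSN], [Transaction ID], [Page ID], [Slot ID]
FROM fn_dblog(NULL, NULL)
WHERE Operation = 'LOP_DELETE_ROWS'
"""

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    pages = cur.execute(BUFFER_PAGES_SQL, (TABLE,)).fetchall()
    print(f"{len(pages)} pages of {TABLE} currently resident in the buffer pool")
    for row in cur.execute(DELETE_LOG_SQL).fetchall():
        print(row)   # cross-reference Page ID / Slot ID against resident pages
```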

27 pages, 1612 KiB  
Article
Employing Quantum Entanglement for Real-Time Coordination of Distributed Electric Vehicle Charging Stations: Advancing Grid Efficiency and Stability
by Dawei Wang, Hanqi Dai, Yuan Jin, Zhuoqun Li, Shanna Luo and Xuebin Li
Energies 2025, 18(11), 2917; https://doi.org/10.3390/en18112917 - 2 Jun 2025
Viewed by 482
Abstract
The widespread deployment of electric vehicles (EVs) has introduced substantial challenges to electricity pricing, grid stability, and renewable energy integration. This paper presents the first real-time quantum-enhanced electricity pricing framework for large-scale EV charging networks, marking a significant departure from existing approaches based on mixed-integer linear programming (MILP) and deep reinforcement learning (DRL). The proposed framework incorporates renewable intermittency, demand elasticity, and infrastructure constraints within a high-dimensional optimization model. The objective is to dynamically determine spatiotemporal electricity prices that reduce system peak load, improve renewable utilization, and minimize user charging costs. A rigorous mathematical formulation is developed, integrating over 40 system-level constraints, including power balance, transmission limits, renewable curtailment, carbon targets, voltage regulation, demand-side flexibility, social participation, and cyber-resilience. Real-time electricity prices are treated as dynamic decision variables influenced by station utilization, elasticity response curves, and the marginal cost of renewable and grid electricity. The model is solved across 96 time intervals using a quantum-classical hybrid method, with benchmark comparisons against MILP and DRL baselines. A comprehensive case study is conducted on a 500-station EV network serving 10,000 vehicles, coupled with a modified IEEE 118-bus grid and 800 MW of variable renewable energy. Historical charging data with ±12% stochastic demand variation and real-world solar/wind profiles are used to simulate realistic conditions. Results show that the proposed framework achieves a 23.4% average peak load reduction per station, a 17.9% gain in renewable utilization, and up to 30% user cost savings compared to flat-rate pricing. Network congestion is mitigated at over 90% of high-traffic stations. Pricing trajectories align low-price windows with high-renewable periods and off-peak hours, enabling synchronized load shifting and enhanced flexibility. Visual analytics using 3D surface plots and disaggregated bar charts confirm structured demand-price interactions and smooth, stable price evolution. These findings validate the potential of quantum-enhanced optimization for scalable, clean, and adaptive EV charging coordination in renewable-rich grid environments.
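
Setting aside the quantum solver and the 40-plus operational constraints, the core pricing problem the abstract describes can be caricatured as a small multi-objective program; the formulation below uses our own illustrative notation and assumed constraint set, not the paper's model.

```latex
% Highly simplified illustrative formulation (our notation, not the paper's):
% choose station/time prices p_{s,t} to trade off peak load, non-renewable
% energy use, and user charging cost, with price-elastic demand.
\begin{aligned}
\min_{p_{s,t}} \quad
  & \alpha \, \max_{t} L_t
  \;+\; \beta \sum_{t} c^{\mathrm{grid}}_{t}\,\bigl(L_t - R_t\bigr)^{+}
  \;+\; \gamma \sum_{s,t} p_{s,t}\, d_{s,t}(p_{s,t}) \\
\text{s.t.}\quad
  & L_t = \sum_{s} d_{s,t}(p_{s,t}), \qquad
    d_{s,t}(p) = d^{0}_{s,t}\Bigl(1 + \varepsilon \,\tfrac{p - \bar{p}}{\bar{p}}\Bigr),
    \quad \varepsilon < 0, \\
  & p^{\min} \le p_{s,t} \le p^{\max}, \qquad
    L_t \le L^{\max}_t \quad \forall\, s, t .
\end{aligned}
```

Here L_t is total charging load, R_t the available renewable generation, (·)+ the positive part, d0_{s,t} the baseline demand at station s, and α, β, γ weights trading off peak load, non-renewable energy use, and user cost.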

54 pages, 17044 KiB  
Review
Perspectives and Research Challenges in Wireless Communications Hardware for the Future Internet and Its Applications Services
by Dimitrios G. Arnaoutoglou, Tzichat M. Empliouk, Theodoros N. F. Kaifas, Constantinos L. Zekios and George A. Kyriacou
Future Internet 2025, 17(6), 249; https://doi.org/10.3390/fi17060249 - 31 May 2025
Viewed by 927
Abstract
The transition from 5G to 6G wireless systems introduces new challenges at the physical layer, including the need for higher frequency operations, massive MIMO deployment, advanced beamforming techniques, and sustainable energy harvesting mechanisms. A plethora of feature articles, reviews, white papers, and roadmaps elaborate on the perspectives and research challenges of wireless systems in general, spanning the unified physical and cyber space. Hence, this paper presents a comprehensive review of the technological challenges and recent advancements in wireless communication hardware that underpin the development of next-generation networks, particularly 6G. Emphasizing the physical layer, the study explores critical enabling technologies including beamforming, massive MIMO, reconfigurable intelligent surfaces (RIS), millimeter-wave (mmWave) and terahertz (THz) communications, wireless power transfer, and energy harvesting. These technologies are analyzed in terms of their functional roles, implementation challenges, and integration into future wireless infrastructure. Beyond traditional physical layer components, the paper also discusses the role of reconfigurable RF front-ends, innovative antenna architectures, and user-end devices that contribute to the adaptability and efficiency of emerging communication systems. In addition, the inclusion of application-driven paradigms such as digital twins highlights how new use cases are shaping design requirements and pushing the boundaries of hardware capabilities. By linking foundational physical-layer technologies with evolving application demands, this work provides a holistic perspective aimed at guiding future research directions and informing the design of scalable, energy-efficient, and resilient wireless communication platforms for the Future Internet. Specifically, we first identify the demands and, in turn, explore existing or emerging technologies with the potential to meet them. In particular, extended attention is given to state-of-the-art antennas for massive MIMO terrestrial and non-terrestrial networks.
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems)
