Search Results (660)

Search Parameters:
Keywords = real-time trace

29 pages, 549 KB  
Article
Catch Me If You Can: Rogue AI Detection and Correction at Scale
by Fatemeh Stodt, Jan Stodt, Mohammed Alshawki, Javad Salimi Sratakhti and Christoph Reich
Electronics 2025, 14(20), 4122; https://doi.org/10.3390/electronics14204122 - 21 Oct 2025
Abstract
Modern AI systems can strategically misreport information when incentives diverge from truthfulness, posing risks for oversight and deployment. Prior studies often examine this behavior within a single paradigm; systematic, cross-architecture evidence under a unified protocol has been limited. We introduce the Strategy Elicitation Battery (SEB), a standardized probe suite for measuring deceptive reporting across large language models (LLMs), reinforcement-learning agents, vision-only classifiers, multimodal encoders, state-space models, and diffusion models. SEB uses Bayesian inference tasks with persona-controlled instructions, schema-constrained outputs, deterministic decoding where supported, and a probe mix (near-threshold, repeats, neutralized, cross-checks). Estimates use clustered bootstrap intervals, and significance is assessed with a logistic regression by architecture; a mixed-effects analysis is planned once the per-round agent/episode traces are exported. On the latest pre-correction runs, SEB shows a consistent cross-architecture pattern in deception rates: ViT 80.0%, CLIP 15.0%, Mamba 10.0%, RL agents 10.0%, Stable Diffusion 10.0%, and LLMs 5.0% (20 scenarios/architecture). A logistic regression on per-scenario flags finds a significant overall architecture effect (likelihood-ratio test vs. intercept-only: χ²(5)=41.56, p=7.22×10⁻⁸). Holm-adjusted contrasts indicate ViT is significantly higher than all other architectures in this snapshot; the remaining pairs are not significant. Post-correction acceptance decisions are evaluated separately using residual deception and override rates under SEB-Correct. Latency varies by architecture (sub-second to minutes), enabling pre-deployment screening broadly and real-time auditing for low-latency classes.
Results indicate that SEB-Detect deception flags are not confined to any one paradigm, that distinct architectures can converge to similar levels under a common interface, and that reporting interfaces and incentive framing are central levers for mitigation. We operationalize “deception” as reward-sensitive misreport flags, and we separate detection from intervention via a correction wrapper (SEB-Correct), supporting principled acceptance decisions for deployment. Full article
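The reported p-value follows directly from the quoted χ² statistic. A minimal sketch of that check, using only the published value χ²(5)=41.56 and the standard closed-form upper tail for five degrees of freedom (nothing here is taken from the paper's underlying data):

```python
import math

def chi2_sf_df5(x: float) -> float:
    """Upper-tail probability P(X > x) for a chi-square variable with 5
    degrees of freedom, via the closed form of the regularized upper
    incomplete gamma function (valid for odd degrees of freedom)."""
    z = x / 2.0
    return (math.erfc(math.sqrt(z))
            + math.exp(-z) * (2.0 * math.sqrt(z) / math.sqrt(math.pi))
            * (1.0 + 2.0 * z / 3.0))

# The abstract reports chi-squared(5) = 41.56 for the architecture effect.
p_value = chi2_sf_df5(41.56)
print(f"p = {p_value:.2e}")  # consistent with the reported 7.22e-8
```

For five degrees of freedom the survival function reduces to an erfc term plus an exponential-polynomial term, so no statistics library is needed; a sanity check is that the function returns 0.5 at the distribution's median (≈4.351).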

21 pages, 1366 KB  
Article
Robotic and On-Flow Solid Phase Extraction Coupled with LC-MS/MS for Simultaneous Determination of 16 PPCPs: Real-Time Monitoring of Wastewater Effluent in Korea
by Sook-Hyun Nam, Homin Kye, Juwon Lee, Eunju Kim, Jae-Wuk Koo, Jeongbeen Park, Yonghyun Shin, Jonggul Lee and Tae-Mun Hwang
Toxics 2025, 13(10), 899; https://doi.org/10.3390/toxics13100899 - 20 Oct 2025
Abstract
Pharmaceuticals and personal care products (PPCPs) are recognized as emerging contaminants of concern, even at ultra-trace concentrations. However, the current detection systems are prohibitively expensive and typically rely on labor-intensive, lab-based workflows that lack automation in sample pretreatment. In this study, we developed a robotic and on-flow solid-phase extraction (ROF-SPE) system, fully integrated with online liquid chromatography-tandem mass spectrometry (LC-MS/MS), for the on-site and real-time monitoring of 16 PPCPs in wastewater effluent. The system automates the entire pretreatment workflow—including sample collection, filtration, pH adjustment, solid-phase extraction, and injection—prior to seamless coupling with LC–MS/MS analysis. The optimized pretreatment parameters (pH 7 and 10, 12 mL wash volume, 9 mL elution volume) were selected for analytical efficiency and cost-effectiveness. Compared with conventional offline SPE methods (~370 min), the total analysis time was reduced to 80 min (78.4% reduction), and parallel automation significantly enhanced the throughput. The system was capable of quantifying target analytes at concentrations as low as 0.1 ng/L. Among the 16 PPCPs monitored at a municipal wastewater treatment plant in South Korea, only sulfamethazine and ranitidine were not detected. Compounds such as iopromide, caffeine, and paraxanthine were detected at high concentrations, and seasonal variation patterns were also observed. This study demonstrates the feasibility of a fully automated and on-site SPE pretreatment system for ultra-trace environmental analysis and presents a practical solution for the real-time monitoring of contaminants in remote areas. Full article
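The quoted 78.4% reduction is the plain ratio of the two runtimes given in the abstract:

```python
# Offline SPE workflow vs. the automated ROF-SPE system (minutes, from the abstract).
offline_min, online_min = 370, 80
reduction = (offline_min - online_min) / offline_min
print(f"time reduction: {reduction:.1%}")  # 78.4%
```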

36 pages, 1471 KB  
Review
Next-Gen Healthcare Devices: Evolution of MEMS and BioMEMS in the Era of the Internet of Bodies for Personalized Medicine
by Maria-Roxana Marinescu, Octavian Narcis Ionescu, Cristina Ionela Pachiu, Miron Adrian Dinescu, Raluca Muller and Mirela Petruța Șuchea
Micromachines 2025, 16(10), 1182; https://doi.org/10.3390/mi16101182 - 19 Oct 2025
Abstract
The rapid evolution of healthcare technology is being driven by advancements in Micro-Electro-Mechanical Systems (MEMS), BioMEMS (Biological MEMS), and the expanding concept of the Internet of Bodies (IoB). This review explores the convergence of these three domains and their transformative impact on personalized medicine (PM), with a focus on smart, connected biomedical devices. Starting from the historical development of MEMS for medical sensing and diagnostics, the review traces the emergence of BioMEMS as biocompatible, minimally invasive solutions for continuous monitoring and real-time intervention. The integration of such devices within the IoB ecosystem enables data-driven, remote, and predictive healthcare, offering tailored diagnostics and treatment for chronic and acute conditions alike. The paper classifies IoB-associated technologies into non-invasive, invasive, and incorporated devices, reviewing wearable systems such as smart bracelets, e-tattoos, and smart footwear, as well as internal devices, including implantable and ingestible systems. Alongside these opportunities, significant challenges persist, particularly in device biocompatibility, data interoperability, cybersecurity, and ethical regulation. By synthesizing recent advances and critical perspectives, this review aims to provide a comprehensive understanding of the current landscape, clinical potential, and future directions of MEMS, BioMEMS, and IoB-enabled personalized healthcare. Full article

28 pages, 8791 KB  
Article
CRSensor: A Synchronized and Impact-Aware Traceability Framework for Business Application Development
by Soojin Park
Appl. Sci. 2025, 15(20), 11083; https://doi.org/10.3390/app152011083 - 16 Oct 2025
Abstract
To enable effective change impact management in business applications, robust requirements traceability is essential. However, manual approaches are inefficient and prone to errors. While prior Model-Driven Engineering (MDE)-based research, including the author’s theoretical models, established the principles of traceability, these approaches lacked decisive quantitative validation using metrics such as precision and recall, thereby limiting their real-world applicability. This paper addresses these limitations by introducing the CRSensor framework, which integrates real-time automated trace link generation with dynamic refinement of the developer model. This approach enhances the reliability and completeness of organizational impact analysis, resolving key weaknesses of conventional link recovery methods. Notably, CRSensor maintains structural consistency throughout the lifecycle, overcoming reliability limitations often found in traditional information retrieval (IR)/machine learning (ML)-based traceability solutions. Empirical evaluation demonstrates that CRSensor achieves an average trace-link precision of 0.95, a recall of 0.98, and an auto-generation rate of 80%. These results validate both the industrial applicability and the quantitative rigor of the proposed framework, paving the way for broader practical adoption. Full article

24 pages, 13118 KB  
Article
A Workflow for Urban Heritage Digitization: From UAV Photogrammetry to Immersive VR Interaction with Multi-Layer Evaluation
by Chengyun Zhang, Guiye Lin, Yuyang Peng and Yingwen Yu
Drones 2025, 9(10), 716; https://doi.org/10.3390/drones9100716 - 16 Oct 2025
Abstract
Urban heritage documentation often separates 3D data acquisition from immersive interaction, limiting both accuracy and user impact. This study develops and validates an end-to-end workflow that integrates UAV photogrammetry with terrestrial LiDAR and deploys the fused model in a VR environment. Applied to Piazza Vittorio Emanuele II in Rovigo, Italy, the approach achieves centimetre-level registration, completes roofs and upper façades that ground scanning alone cannot capture, and produces stable, high-fidelity assets suitable for real-time interaction. Effectiveness is assessed through a three-layer evaluation framework encompassing vision, behavior, and cognition. Eye-tracking heatmaps and scanpaths show that attention shifts from dispersed viewing to concentrated focus on landmarks and panels. Locomotion traces reveal a transition from diffuse roaming to edge-anchored strategies, with stronger reliance on low-visibility zones for spatial judgment. Post-VR interviews confirm improved spatial comprehension, stronger recognition of cultural values, and enhanced conservation intentions. The results demonstrate that UAV-enabled completeness directly influences how users perceive, navigate, and interpret heritage spaces in VR. The workflow is cost-effective, replicable, and transferable, offering a practical model for under-resourced heritage sites. More broadly, it provides a methodological template for linking drone-based data acquisition to measurable cognitive and cultural outcomes in immersive heritage applications. Full article
(This article belongs to the Special Issue Implementation of UAV Systems for Cultural Heritage)

24 pages, 6626 KB  
Article
Harnessing GPS Spatiotemporal Big Data to Enhance Visitor Experience and Sustainable Management of UNESCO Heritage Sites: A Case Study of Mount Huangshan, China
by Jianping Sun, Shi Chen, Yinlan Huang, Huifang Rong and Qiong Li
ISPRS Int. J. Geo-Inf. 2025, 14(10), 396; https://doi.org/10.3390/ijgi14100396 - 12 Oct 2025
Abstract
In the era of big data, the rapid proliferation of user-generated content enriched with geolocations offers new perspectives and datasets for probing the spatiotemporal dynamics of tourist mobility. Mining large-scale geospatial traces has become central to tourism geography: it reveals preferences for attractions and routes to enable intelligent recommendation, enhance visitor experience, and advance smart tourism, while also informing spatial planning, crowd management, and sustainable destination development. Using Mount Huangshan—a UNESCO World Cultural and Natural Heritage site—as a case study, we integrate GPS trajectories and geo-tagged photographs from 2017–2023. We apply a Density-Field Hotspot Detector (DF-HD), a Space–Time Cube (STC), and spatial gridding to analyze behavior from temporal, spatial, and fully spatiotemporal perspectives. Results show a characteristic “double-peak, double-trough” seasonal pattern in the number of GPS tracks, cumulative track length, and geo-tagged photos. Tourist behavior exhibits pronounced elevation dependence, with clear vertical differentiation. DF-HD efficiently delineates hierarchical hotspot areas and visitor interest zones, providing actionable evidence for demand-responsive crowd diversion. By integrating sequential time slices with geography in a 3D framework, the STC exposes dynamic spatiotemporal associations and evolutionary regularities in visitor flows, supporting real-time crowd diagnosis and optimized spatial resource allocation. Comparative findings further confirm that Huangshan’s seasonal intensity is significantly lower than previously reported, while the high agreement between trajectory density and gridded photos clarifies the multi-tier clustering of route popularity. 
These insights furnish a scientific basis for designing secondary tour loops, alleviating pressure on core areas, and charting an effective pathway toward internal structural optimization and sustainable development of the Mount Huangshan Scenic Area. Full article
(This article belongs to the Special Issue Spatial Information for Improved Living Spaces)

12 pages, 1169 KB  
Article
Research on Space Object Origin Tracing Approach Using Density Peak Clustering and Distance Feature Optimization
by Jinyan Xue, Yasheng Zhang, Xuefeng Tao and Shuailong Zhao
Appl. Sci. 2025, 15(20), 10943; https://doi.org/10.3390/app152010943 - 11 Oct 2025
Abstract
The exponential growth of space objects in near-Earth and geostationary orbits has posed severe threats to space environment safety, with debris clouds from spacecraft breakup events being a critical concern. Debris cloud tracing, as a key technology for locating breakup points, faces dual challenges of insufficient precision in analytical methods and excessive computational load in numerical methods. To balance traceability accuracy with computational efficiency, this paper proposes a breakup time determination method integrating a clustering algorithm and the minimization of average relative distance. The method first calculates the average relative distance between fragment pairs and preliminarily estimates the breakup epoch using a golden section step-size optimization strategy. Subsequently, the density peak clustering (DPC) algorithm is introduced to eliminate abnormal fragments. The breakup epoch is then refined based on the cleansed fragment dataset, achieving high-precision localization. Validation through simulations of real breakup events demonstrates that this method significantly improves localization accuracy. It establishes a highly reliable temporal benchmark for space collision tracing, debris diffusion prediction, and orbital safety management. Full article
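The golden-section step-size strategy mentioned above can be sketched generically. The objective below is a hypothetical stand-in for the paper's average relative distance between back-propagated fragments (the true epoch value is illustrative); only the search procedure itself is the standard algorithm:

```python
import math

PHI_INV = (math.sqrt(5) - 1) / 2  # 1/phi, the golden-section ratio ≈ 0.618

def golden_section_min(f, lo, hi, tol=1e-6):
    """Locate the minimizer of a unimodal function f on [lo, hi] by
    golden-section interval shrinking (derivative-free)."""
    a, b = lo, hi
    c = b - PHI_INV * (b - a)
    d = a + PHI_INV * (b - a)
    while b - a > tol:
        if f(c) < f(d):          # minimum lies in [a, d]
            b, d = d, c
            c = b - PHI_INV * (b - a)
        else:                    # minimum lies in [c, b]
            a, c = c, d
            d = a + PHI_INV * (b - a)
    return (a + b) / 2

# Hypothetical stand-in for the paper's objective: mean pairwise distance
# between back-propagated fragments, smallest near the true breakup epoch.
true_epoch = 3.7  # illustrative, not from the paper
mean_relative_distance = lambda t: (t - true_epoch) ** 2 + 1.0

t_est = golden_section_min(mean_relative_distance, 0.0, 10.0)
print(f"estimated breakup epoch ≈ {t_est:.4f}")
```

Each iteration shrinks the bracketing interval by the golden ratio while reusing one interior point, which keeps the number of objective evaluations low, useful when each candidate epoch requires propagating a whole fragment cloud.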

21 pages, 5810 KB  
Article
Investigating Seasonal Water Quality Dynamics in Humid, Subtropical Louisiana Facultative Waste Stabilization Ponds
by Mason Marcantel, Mahathir Bappy and Michael Hayes
Water 2025, 17(20), 2936; https://doi.org/10.3390/w17202936 - 11 Oct 2025
Abstract
Waste stabilization ponds (WSPs) in humid, subtropical climates rely on stable temperatures and mechanical aeration to promote microbial activity. These critical infrastructures can lack operational resources to ensure efficient treatment, which can impact downstream communities. This study aims to use remote water quality sensor data to establish trends in a yearly dataset and correlate various water quality parameters for simplistic identification of pond health. A facultative WSP was monitored in two stages: the primary settling over a period of 14 months to evaluate partially treated water, and the secondary treatment pond for a period of 11 months to monitor final stage water quality parameters. A statistical analysis was performed on the measured parameters (dissolved oxygen, temperature, conductivity, pH, turbidity, nitrate, and ammonium) to establish a comprehensive yearly, seasonal, and monthly dataset to show fluctuations in water parameter correlations. Standard relationships in dissolved oxygen, conductivity, pH, and temperature were traced during the seasonal fluctuations, which provided insight into nitrogen processing by microbial communities. During this study, the summer period showed the most variability, specifically a deviation in the dissolved oxygen and temperature relationship from a yearly moderate negative correlation (−0.593) to a moderate positive correlation (0.459), indicating a direct relationship. The secondary treatment pond data showed more nitrogen species correlation, which can indicate final cycling during seasonal transitions. Understanding pond dynamics can lead to impactful, proactive operational decisions to address pond imbalance or chemical dosing for final treatment. By establishing parameter correlations, facilities with WSPs can strategically integrate sensor networks for real-time pond health and treatment efficiency monitoring during seasonal fluctuations. Full article
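The sign flip the abstract describes (yearly −0.593 vs. summer +0.459) is a plain Pearson correlation computed per period. A minimal sketch with illustrative, not measured, values:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (synthetic) readings: outside summer, dissolved oxygen tends
# to fall as water temperature rises, giving a negative correlation.
temp_c  = [12.0, 15.5, 19.0, 24.5, 28.0, 31.0]
do_mg_l = [9.1, 8.6, 7.9, 7.0, 6.2, 5.5]

r = pearson_r(temp_c, do_mg_l)
print(f"r = {r:.3f}")  # strongly negative for this synthetic series
```

Computing the same coefficient over a summer-only window would expose the deviation toward a positive relationship that the study reports.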
(This article belongs to the Section Wastewater Treatment and Reuse)

27 pages, 3153 KB  
Review
Evolutionary Insight into Fatal Human Coronaviruses (hCoVs) with a Focus on Circulating SARS-CoV-2 Variants Under Monitoring (VUMs)
by Mohammad Asrar Izhari, Fahad Alghamdi, Essa Ajmi Alodeani, Ahmad A. Salem, Ahamad H. A. Almontasheri, Daifallah M. M. Dardari, Mansour A. A. Hadadi, Ahmed R. A. Gosady, Wael A. Alghamdi, Bakheet A. Alzahrani and Bandar M. A. Alzahrani
Biomedicines 2025, 13(10), 2450; https://doi.org/10.3390/biomedicines13102450 - 8 Oct 2025
Abstract
The breach of an interspecies barrier by RNA viruses has facilitated the emergence of lethal hCoVs, particularly SARS-CoV-2, resulting in significant socioeconomic setbacks and public health risks globally in recent years. Moreover, the high evolutionary plasticity of hCoVs has led to the continuous emergence of diverse variants, complicating clinical management and public health responses. Studying the evolutionary trajectory of hCoVs, which provides a molecular roadmap for understanding viruses’ adaptation, tissue tropism, spread, virulence, and immune evasion, is crucial for addressing the challenges of zoonotic spillover of viruses. Tracing the evolutionary trajectory of lethal hCoVs provides essential genomic insights required for risk stratification, variant/sub-variant classification, preparedness for outbreaks and pandemics, and the identification of critical viral elements for vaccine and therapeutic development. Therefore, this review examines the evolutionary landscape of the three known lethal hCoVs, presenting a focused narrative on SARS-CoV-2 variants under monitoring (VUMs) as of May 2025. Using advanced bioinformatics approaches and data visualization, the review highlights key spike protein substitutions, particularly within the receptor-binding domain (RBD), which drive transmissibility, immune escape, and potential resistance to therapeutics. The article highlights the importance of real-time genomic surveillance and intervention strategies in mitigating emerging variant/sub-variant risks within the ongoing COVID-19 landscape. Full article

14 pages, 786 KB  
Article
Typing of Yersinia pestis in Challenging Forensic Samples Through Targeted Next-Generation Sequencing of Multilocus Variable Number Tandem Repeat Regions
by Hyeongseok Yun, Seung-Ho Lee, Se Hun Gu, Seung Hyun Lim and Dong Hyun Song
Microorganisms 2025, 13(10), 2320; https://doi.org/10.3390/microorganisms13102320 - 7 Oct 2025
Abstract
Microbial forensics involves analyzing biological evidence to evaluate weaponized microorganisms or their toxins. This study aimed to detect and type Yersinia pestis from four simulated forensic samples—human plasma diluted in phosphate-buffered saline (#24-2), tomato juice (#24-5), grape juice (#24-8), and a surgical mask (#24-10). Notably, sample #24-10 may have contained live bacteria other than Y. pestis. A real-time polymerase chain reaction confirmed the presence of Y. pestis in all samples; however, whole-genome sequencing (WGS) coverage of the Y. pestis chromosome ranged from 0.46% to 97.1%, largely due to host DNA interference and low abundance. To address these limitations and enable strain-level identification, we designed a hybridization-based target enrichment approach focused on multilocus variable number tandem repeat analysis (MLVA). Next-generation sequencing (NGS) using whole-genome amplification revealed that the accuracy of the 25 MLVA profiles of Y. pestis for samples #24-2, #24-5, #24-8, and #24-10 was 4%, 100%, 52%, and 0%, respectively. However, all samples showed 100% accuracy with target-enriched NGS, confirming they all belong to the same strain. These findings demonstrate that a targeted enrichment strategy for MLVA loci can overcome common obstacles in microbial forensics, particularly when working with trace or degraded samples where conventional WGS proves challenging. Full article

27 pages, 2189 KB  
Article
Miss-Triggered Content Cache Replacement Under Partial Observability: Transformer-Decoder Q-Learning
by Hakho Kim, Teh-Jen Sun and Eui-Nam Huh
Mathematics 2025, 13(19), 3217; https://doi.org/10.3390/math13193217 - 7 Oct 2025
Abstract
Content delivery networks (CDNs) face steadily rising, uneven demand, straining heuristic cache replacement. Reinforcement learning (RL) is promising, but most work assumes a fully observable Markov Decision Process (MDP), unrealistic under delayed, partial, and noisy signals. We model cache replacement as a Partially Observable MDP (POMDP) and present the Miss-Triggered Cache Transformer (MTCT), a Transformer-decoder Q-learning agent that encodes recent histories with self-attention. MTCT invokes its policy only on cache misses to align compute with informative events and uses a delayed-hit reward to propagate information from hits. A compact, rank-based action set (12 actions by default) captures popularity–recency trade-offs with complexity independent of cache capacity. We evaluate MTCT on a real trace (MovieLens) and two synthetic workloads (Mandelbrot–Zipf, Pareto) against Adaptive Replacement Cache (ARC), Windowed TinyLFU (W-TinyLFU), classical heuristics, and Double Deep Q-Network (DDQN). MTCT achieves the best or statistically comparable cache-hit rates on most cache sizes; e.g., on MovieLens at M=600, it reaches 0.4703 (DDQN 0.4436, ARC 0.4513). Miss-triggered inference also lowers mean wall-clock time per episode; Transformer inference is well suited to modern hardware acceleration. Ablations support CL=50 and show that finer action grids improve stability and final accuracy. Full article
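The miss-triggered control flow, invoking a replacement policy only on cache misses, can be sketched with a toy frequency-based stand-in for the paper's learned Transformer policy (the class name and the eviction rule here are illustrative, not from the paper):

```python
from collections import OrderedDict

class MissTriggeredCache:
    """Toy cache that consults its eviction policy only on misses,
    mirroring the miss-triggered control flow; the learned Transformer
    policy is replaced by a simple frequency/recency rule."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()   # key -> access count, kept in recency order
        self.hits = self.misses = 0

    def access(self, key):
        if key in self.store:            # hit: no policy invocation at all
            self.hits += 1
            self.store[key] += 1
            self.store.move_to_end(key)  # most recently used goes last
            return
        self.misses += 1                 # miss: this is the only policy trigger
        if len(self.store) >= self.capacity:
            del self.store[self._policy()]
        self.store[key] = 1

    def _policy(self):
        # Stand-in policy: evict the least-frequently-used key; ties break
        # toward the least recently used, since iteration is oldest-first.
        return min(self.store, key=self.store.get)

cache = MissTriggeredCache(capacity=2)
for k in ["a", "b", "a", "c", "a", "b"]:
    cache.access(k)
print(cache.hits, cache.misses)  # 2 4
```

Gating policy calls on misses is what aligns (potentially expensive) inference with the informative events, since a hit requires no replacement decision.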
(This article belongs to the Section E1: Mathematics and Computer Science)

13 pages, 748 KB  
Article
Lattice-Based Identity Authentication Protocol with Enhanced Privacy and Scalability for Vehicular Ad Hoc Networks
by Kuo-Yu Tsai and Ying-Hsuan Yang
Future Internet 2025, 17(10), 458; https://doi.org/10.3390/fi17100458 - 7 Oct 2025
Abstract
Vehicular ad hoc networks (VANETs) demand authentication mechanisms that are both secure and privacy-preserving, particularly in light of emerging quantum-era threats. In this work, we propose a lattice-based identity authentication protocol that leverages pseudo-IDs to safeguard user privacy, while allowing the Trusted Authority (TA) to trace misbehaving vehicles when necessary. Compared with existing approaches, the proposed scheme strengthens accountability, improves scalability, and offers resistance against quantum attacks. A comprehensive complexity analysis is presented, addressing computational, communication, and storage overhead. Analysis results under practical parameter settings demonstrate that the protocol delivers robust security with manageable overhead, maintaining authentication latency within the real-time requirements of VANET applications. Full article

14 pages, 2238 KB  
Article
Functional Biopolymer-Stabilized Silver Nanoparticles on Glassy Carbon: A Voltammetric Sensor for Trace Thallium(I) Detection
by Bożena Karbowska, Maja Giera, Anna Modrzejewska-Sikorska and Emilia Konował
Int. J. Mol. Sci. 2025, 26(19), 9658; https://doi.org/10.3390/ijms26199658 - 3 Oct 2025
Abstract
Thallium is a soft metal with a grey or silvery hue. It commonly occurs in two oxidation states in chemical compounds: Tl⁺ and Tl³⁺. Thermodynamically, Tl⁺ is significantly more stable and typically represents the dominant form of thallium in environmental systems. However, in this chemical form, thallium remains highly toxic. This study focuses on the modification of a glassy carbon electrode (GCE) with silver nanostructures stabilised by potato starch derivatives. The modified electrode (GCE/AgNPs-E1451) was used for the determination of trace amounts of thallium ions using anodic stripping voltammetry. Emphasis was placed on assessing the effect of surface modification on key electrochemical performance parameters of the electrode. Measurements were carried out in a base electrolyte (EDTA) and in a real soil sample collected from Bali. The stripping peak current of thallium exhibited linearity over the concentration range from 19 to 410 ppb (9.31 × 10⁻⁸ to 2.009 × 10⁻⁶ mol/dm³). The calculated limit of detection (LOD) was 18.8 ppb (9.21 × 10⁻⁸ mol/dm³), while the limit of quantification (LOQ) corresponded to 56.4 ppb (2.76 × 10⁻⁷ mol/dm³). The GCE/AgNPs-E1451 electrode demonstrates several significant advantages, including a wide detection range, reduced analysis time due to the elimination of time-consuming pre-concentration steps, and non-toxic operation compared to mercury-based electrodes. Full article
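The ppb-to-molarity figures in the abstract can be cross-checked from the molar mass of thallium (204.38 g/mol); the last digit of the LOD differs slightly from the reported 9.21 × 10⁻⁸ depending on which molar-mass value is used in the conversion:

```python
# Convert a mass concentration in ppb (≈ µg/L in dilute aqueous solution)
# to molarity using the molar mass of thallium.
TL_MOLAR_MASS = 204.38  # g/mol

def ppb_to_mol_per_dm3(ppb: float, molar_mass: float = TL_MOLAR_MASS) -> float:
    grams_per_litre = ppb * 1e-6   # 1 ppb ≈ 1 µg/L = 1e-6 g/L in water
    return grams_per_litre / molar_mass

lod = ppb_to_mol_per_dm3(18.8)   # reported as 9.21e-8 mol/dm3
loq = ppb_to_mol_per_dm3(56.4)   # reported as 2.76e-7 mol/dm3
print(f"LOD ≈ {lod:.2e} mol/dm³, LOQ ≈ {loq:.2e} mol/dm³")
```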
(This article belongs to the Special Issue New Advances in Metal Nanoparticles)

15 pages, 479 KB  
Article
Security of Quantum Key Distribution with One-Time-Pad-Protected Error Correction and Its Performance Benefits
by Roman Novak
Entropy 2025, 27(10), 1032; https://doi.org/10.3390/e27101032 - 1 Oct 2025
Abstract
In quantum key distribution (QKD), public discussion over the authenticated classical channel inevitably leaks information about the raw key to a potential adversary, which must later be mitigated by privacy amplification. To limit this leakage, a one-time pad (OTP) has been proposed to protect message exchanges in various settings. Building on the security proof of Tomamichel and Leverrier, which is based on a non-asymptotic framework and considers the effects of finite resources, we extend the analysis to the OTP-protected scheme. We show that when the OTP key is drawn from the entropy pool of the same QKD session, the achievable quantum key rate is identical to that of the reference protocol with unprotected error-correction exchange. This equivalence holds for a fixed security level, defined via the diamond distance between the real and ideal protocols modeled as completely positive trace-preserving maps. At the same time, the proposed approach reduces the computational requirements: for non-interactive low-density parity-check codes, the encoding problem size is reduced by the square of the syndrome length, while privacy amplification requires less compression. The technique preserves security, avoids the use of QKD keys between sessions, and has the potential to improve performance. Full article
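The one-time-pad protection of the error-correction exchange rests on the standard XOR construction; a minimal sketch (the syndrome bytes are hypothetical, and this illustrates only the masking step, not the QKD protocol or its rate analysis):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """One-time-pad combination: information-theoretically hides a
    when b is uniform, secret, and never reused."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical syndrome bytes produced by the error-correction step;
# the pad is drawn once from the session's shared entropy pool.
syndrome = bytes([0b10110010, 0b01101100])
pad = secrets.token_bytes(len(syndrome))

ciphertext = xor_bytes(syndrome, pad)   # what crosses the public channel
recovered = xor_bytes(ciphertext, pad)  # the legitimate receiver undoes the pad

assert recovered == syndrome
```

Unlike an unprotected exchange, the ciphertext alone carries no information about the syndrome; the cost is one pad bit per syndrome bit, which the paper argues can come from the same session's entropy pool without lowering the achievable key rate.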
(This article belongs to the Section Quantum Information)

19 pages, 1182 KB  
Article
HGAA: A Heterogeneous Graph Adaptive Augmentation Method for Asymmetric Datasets
by Hongbo Zhao, Wei Liu, Congming Gao, Weining Shi, Zhihong Zhang and Jianfei Chen
Symmetry 2025, 17(10), 1623; https://doi.org/10.3390/sym17101623 - 1 Oct 2025
Abstract
Edge intelligence plays an increasingly vital role in ensuring the reliability of distributed microservice-based applications, which are widely used in domains such as e-commerce, industrial IoT, and cloud-edge collaborative platforms. However, anomaly detection in these systems encounters a critical challenge: labeled anomaly data are scarce. This scarcity leads to severe class asymmetry and compromised detection performance, particularly under the resource constraints of edge environments. Recent approaches based on Graph Neural Networks (GNNs)—often integrated with DeepSVDD and regularization techniques—have shown potential, but they rarely address this asymmetry in an adaptive, scenario-specific way. This work proposes Heterogeneous Graph Adaptive Augmentation (HGAA), a framework tailored for edge intelligence scenarios. HGAA dynamically optimizes graph data augmentation by leveraging feedback from online anomaly detection. To enhance detection accuracy while adhering to resource constraints, the framework incorporates a selective bias toward underrepresented anomaly types. It uses knowledge distillation to model dataset-dependent distributions and adaptively adjusts augmentation probabilities, thus avoiding excessive computational overhead in edge environments. Additionally, a dynamic adjustment mechanism evaluates augmentation success rates in real time, refining the selection processes to maintain model robustness. Experiments were conducted on two real-world datasets (TraceLog and FlowGraph) under simulated edge scenarios. Results show that HGAA consistently outperforms competitive baseline methods. Specifically, compared with the best non-adaptive augmentation strategies, HGAA achieves an average improvement of 4.5% in AUC and 4.6% in AP. Even larger gains are observed in challenging cases: for example, when using the HGT model on the TraceLog dataset, AUC improves by 14.6% and AP by 18.1%. 
Beyond accuracy, HGAA also significantly enhances efficiency: compared with filter-based methods, training time is reduced by up to 71% on TraceLog and 8.6% on FlowGraph, confirming its suitability for resource-constrained edge environments. These results highlight the potential of adaptive, edge-aware augmentation techniques in improving microservice anomaly detection within heterogeneous, resource-limited environments. Full article
(This article belongs to the Special Issue Symmetry and Asymmetry in Embedded Systems)
