Search Results (241)

Search Parameters:
Keywords = data-centric communications

23 pages, 471 KB  
Article
Harvest-Now, Decrypt-Later: A Temporal Cybersecurity Risk in the Quantum Transition
by Francis Kagai, Philip Branch, Jason But and Rebecca Allen
Telecom 2025, 6(4), 100; https://doi.org/10.3390/telecom6040100 - 18 Dec 2025
Abstract
Telecommunication infrastructures rely on cryptographic protocols designed for long-term confidentiality, yet data exchanged today faces future exposure when adversaries acquire quantum or large-scale computational capabilities. This harvest-now, decrypt-later (HNDL) threat transforms persistent communication records into time-dependent vulnerabilities. We model HNDL as a temporal cybersecurity risk, formalizing the adversarial process of deferred decryption and quantifying its impact across sectors with varying confidentiality requirements. Our framework evaluates how delayed post-quantum cryptography (PQC) migration amplifies exposure and how hybrid key exchange and forward-secure mechanisms mitigate it. Results show that high-retention sectors such as satellite and health networks face exposure windows extending decades under delayed PQC adoption, while hybrid and forward-secure approaches reduce this risk horizon by over two-thirds. We demonstrate that temporal exposure is a measurable function of data longevity and migration readiness, introducing a network-centric model linking quantum vulnerability to communication performance and governance. Our findings underscore the urgent need for crypto-agile infrastructures that maintain confidentiality as a continuous assurance process throughout the quantum transition. Full article
(This article belongs to the Special Issue Emerging Technologies in Communications and Machine Learning)
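
As a back-of-the-envelope illustration of the temporal exposure idea the abstract describes, the sketch below computes how long harvested ciphertext stays both sensitive and decryptable under assumed retention periods and migration/quantum-capability dates; the dates, retention figure, and simple year arithmetic are illustrative assumptions, not the authors' actual framework.

```python
def exposure_window_years(harvest_year: int,
                          retention_years: int,
                          pqc_migration_year: int,
                          quantum_capable_year: int) -> float:
    """Years during which harvested ciphertext is both still sensitive
    and decryptable by a quantum-capable adversary (simplified model)."""
    # Traffic recorded before PQC migration is assumed harvestable and vulnerable.
    if harvest_year >= pqc_migration_year:
        return 0.0
    sensitive_until = harvest_year + retention_years
    exposure_start = max(quantum_capable_year, harvest_year)
    return max(0.0, sensitive_until - exposure_start)

# Hypothetical high-retention sector (e.g., health records kept 30 years).
print(exposure_window_years(harvest_year=2025, retention_years=30,
                            pqc_migration_year=2032, quantum_capable_year=2035))
# -> 20.0 years of residual exposure under these assumed dates.
```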

27 pages, 9001 KB  
Article
The Research on a Collaborative Management Model for Multi-Source Heterogeneous Data Based on OPC Communication
by Jiashen Tian, Cheng Shang, Tianfei Ren, Zhan Li, Eming Zhang, Jing Yang and Mingjun He
Sensors 2025, 25(24), 7517; https://doi.org/10.3390/s25247517 - 10 Dec 2025
Abstract
Effectively managing multi-source heterogeneous data remains a critical challenge in distributed cyber-physical systems (CPS). To address this, we present a novel and edge-centric computing framework integrating four key technological innovations. Firstly, a hybrid OPC communication stack seamlessly combines Client/Server, Publish/Subscribe, and P2P paradigms, enabling scalable interoperability across devices, edge nodes, and the cloud. Secondly, an event-triggered adaptive Kalman filter is introduced; it incorporates online noise-covariance estimation and multi-threshold triggering mechanisms. This approach significantly reduces state-estimation error by 46.7% and computational load by 41% compared to conventional fixed-rate sampling. Thirdly, temporal asynchrony among edge sensors is resolved by a Dynamic Time Warping (DTW)-based data-fusion module, which employs optimization constrained by Mahalanobis distance. Ultimately, a content-aware deterministic message queue data distribution mechanism is designed to ensure an end-to-end latency of less than 10 ms for critical control commands. This mechanism, which utilizes a “rules first” scheduling strategy and a dynamic resource allocation mechanism, guarantees low latency for key instructions even under the response loads of multiple data messages. The core contribution of this study is the proposal and empirical validation of an architecture co-design methodology aimed at ultra-high-performance industrial systems. This approach moves beyond the conventional paradigm of independently optimizing individual components, and instead prioritizes system-level synergy as the foundation for performance enhancement. Experimental evaluations were conducted under industrial-grade workloads, which involve over 100 heterogeneous data sources. These evaluations reveal that systems designed with this methodology can simultaneously achieve millimeter-level accuracy in field data acquisition and millisecond-level latency in the execution of critical control commands. These results highlight a promising pathway toward the development of real-time intelligent systems capable of meeting the stringent demands of next-generation industrial applications, and demonstrate immediate applicability in smart manufacturing domains. Full article
(This article belongs to the Section Communications)
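
The event-triggered filtering idea can be illustrated with a minimal scalar sketch: a Kalman update that only transmits when the innovation exceeds a fixed threshold. The noise values and threshold below are arbitrary, and this is a simplified stand-in for the paper's adaptive multi-threshold design, not a reimplementation of it.

```python
import random

def event_triggered_kalman(measurements, q=1e-3, r=0.05, threshold=0.2):
    """Scalar Kalman filter that only transmits an update when the
    innovation exceeds a threshold (send-on-delta style triggering)."""
    x, p = measurements[0], 1.0      # initial state estimate and covariance
    transmitted = []
    for z in measurements[1:]:
        p = p + q                     # predict (random-walk process model)
        k = p / (p + r)               # Kalman gain
        innovation = z - x
        x = x + k * innovation        # update state estimate
        p = (1 - k) * p
        if abs(innovation) > threshold:   # event trigger: publish only large changes
            transmitted.append(x)
    return x, transmitted

readings = [20.0 + 0.3 * random.random() for _ in range(100)]
estimate, sent = event_triggered_kalman(readings)
print(f"final estimate {estimate:.2f}, transmitted {len(sent)}/99 updates")
```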

16 pages, 8827 KB  
Review
Pain Experience in Oncology: A Targeted Literature Review and Development of a Novel Patient-Centric Conceptual Model
by Chloe Carmichael, Sophie Van Tomme, Jordan Miller, Danielle Burns, Cecile Gousset, Helen Kitchen, Harriet Makin, Natalie V. J. Aldhouse and Paul Cordero
Cancers 2025, 17(23), 3760; https://doi.org/10.3390/cancers17233760 - 25 Nov 2025
Abstract
Background and objective: Typical endpoints in cancer clinical trials focus on standardized efficacy endpoints, such as overall survival. Pain is not always assessed, although it is a highly prevalent and distressing aspect of patients’ cancer experience and plays a critical role in health-related quality of life. To inform future pain measurement strategies in oncology, this targeted literature review of pain-related qualitative publications in oncology aimed to characterize and explore the patient experience of pain, and its impact on how patients feel and function. Methods: A review of publications in MEDLINE, Embase and PsycINFO from 2018 to 2023 was conducted. Patient quotes or author descriptions/interpretations were extracted and analyzed with directed content analysis techniques, using ATLAS.ti v9. Data were synthesized to inform the development of a conceptual model. Results: Twenty-eight publications, with data from 534 patients across different oncology indications and geographies, were reviewed. Pain was triggered by disease symptoms and treatment, including surgical procedures, chemotherapy, and radiation. Pain was most often daily, severe, and chronic in nature. Characterizations of pain varied, but most often “sharp”/“stabbing”/“shooting” pain was described across different treatment stages. Pain had an extensive impact on emotional wellbeing, activities of daily living, physical, physiological and social functioning, sleep and work. Unmet needs included difficulty communicating pain needs to healthcare practitioners and fear/distrust of opioid pain medication. Conclusions: This research provides a patient-centric model conceptualizing the patient experience of cancer-related pain. The findings highlight the burden and all-encompassing impact of cancer-related pain, demonstrating the importance of assessing pain in oncology clinical trials. Full article
(This article belongs to the Special Issue Integrating Palliative Care in Oncology)

38 pages, 1419 KB  
Systematic Review
Mapping Digital Solutions for Multi-Scale Built Environment Observation: A Cluster-Based Systematic Review
by Aleksandra Milovanović, Uroš Šošević, Nikola Cvetković, Mladen Pešić, Stefan Janković, Verica Krstić, Jelena Ristić Trajković, Milica Milojević, Ana Nikezić, Dejan Simić and Vladan Djokić
Smart Cities 2025, 8(6), 196; https://doi.org/10.3390/smartcities8060196 - 24 Nov 2025
Abstract
This study investigates the intersection of digital tools and methods with the built environment disciplinary framework, focusing on Urban Planning and Development (UPD), Architecture, Engineering, and Construction (AEC), and Cultural Heritage (CH) domains. Using a systematic literature review of 29 solution-oriented documents, the research applies both bibliometric and in-depth content analysis to identify methodological patterns. Co-occurrence mapping revealed four thematic clusters—Data Integration and User-Centric Analysis, Advanced 3D Spatial Analysis and Processing, Real-Time Interaction and Digital Twin Support, and 3D Visualization—each corresponding to distinct stages in a digital workflow, from data acquisition to interactive communication. Comparative and interdependency analyses demonstrated that these clusters operate in a sequential yet interconnected manner, with Data Integration forming the foundation for analysis, simulation, and visualization tasks. While current solutions are robust within individual stages, they remain fragmented, indicating a need for systemic interoperability. The findings underscore the opportunity to develop integrated digital platforms that synthesize these clusters, enabling more comprehensive observation, management, and planning of the built environment. Such integration could strengthen decision-making frameworks, enhance public participation, and advance sustainable, smart city development. Full article

23 pages, 3691 KB  
Article
Development of a Cost-Effective Multiparametric Probe for Continuous Real-Time Monitoring of Aquatic Environments
by Samuel Fernandes, Alice Fialho, José Maria Santos, Teresa Ferreira and Ana Filipa Filipe
Sensors 2025, 25(23), 7110; https://doi.org/10.3390/s25237110 - 21 Nov 2025
Abstract
Continuous, real-time measurements are essential for informed water resource management and the development of strategies for the protection of aquatic ecosystems. Traditional methods of water quality assessment often fail to adequately capture seasonal trends, and the frequency and rapidity of fluctuations. To address this challenge, a standalone, low-cost (<EUR 1000), autonomous multisensor prototype for remote assessment was developed. The design of the system was optimized with a hardware-centric approach to minimize costs, whilst providing reliability and high precision and accuracy. Based on embedded systems and capable of long-range communication through GSM/GPRS, the device operates with minimal human intervention, ensuring timely data availability for analysis and decision-making. The multisensor instrument determines four important water quality parameters: pH, conductivity, temperature, and water level. Calibration and sensitivity analyses were performed; 1000 measurements per sensor indicated distributions consistent with normality for pH, conductivity, and water level. The results demonstrated high performance in pH measurements (mean: 5.65 on the Sørensen scale, R2 = 0.9992, expanded uncertainty: ±0.4), conductivity (R2 = 0.9999, expanded uncertainties: ±56.52 to ±3200.00 µS/cm for various standards), and water level (R2 = 0.9952, expanded uncertainty: ±5.2 cm). Capable of providing continuous, accurate data at low cost, this multiparameter probe has broad applicability in environmental regulation compliance, pollution control, and sustainable ecosystem management. Full article
(This article belongs to the Special Issue Sensors for Water Quality Monitoring and Assessment)
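
As a rough sketch of the linear calibration and goodness-of-fit computation such a low-cost probe relies on, the snippet below fits invented raw-ADC-versus-reference-pH points and reports R²; the data values are assumptions for illustration only.

```python
import numpy as np

# Hypothetical calibration points: raw ADC readings vs. reference pH buffers.
raw = np.array([312.0, 418.0, 525.0, 633.0, 741.0])
ref_ph = np.array([4.01, 5.00, 6.00, 7.00, 8.00])

slope, intercept = np.polyfit(raw, ref_ph, 1)   # linear calibration curve
predicted = slope * raw + intercept

# Coefficient of determination for the fit.
ss_res = np.sum((ref_ph - predicted) ** 2)
ss_tot = np.sum((ref_ph - ref_ph.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"pH = {slope:.5f} * raw + {intercept:.3f},  R^2 = {r2:.4f}")
```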

31 pages, 4478 KB  
Article
Explainable Artificial Intelligence System for Guiding Companies and Users in Detecting and Fixing Multimedia Web Vulnerabilities on MCS Contexts
by Sergio Alloza-García, Iván García-Magariño and Raquel Lacuesta Gilaberte
Future Internet 2025, 17(11), 524; https://doi.org/10.3390/fi17110524 - 17 Nov 2025
Abstract
In the evolving landscape of Mobile Crowdsourcing (MCS), ensuring the security and privacy of both stored and transmitted multimedia content has become increasingly challenging. Factors such as human mobility, device heterogeneity, dynamic topologies, and data diversity exacerbate the complexity of addressing these concerns effectively. To tackle these challenges, this paper introduces CSXAI (Crowdsourcing eXplainable Artificial Intelligence)—a novel explainable AI system designed to proactively assess and communicate the security status of multimedia resources downloaded in MCS environments. While CSXAI integrates established attack detection techniques, its primary novelty lies in its synthesis of these methods with a user-centric XAI framework tailored for the specific challenges of MCS frameworks. CSXAI intelligently analyzes potential vulnerabilities and threat scenarios by evaluating website context, attack impact, and user-specific characteristics. The current implementation focuses on the detection and explanation of three major web vulnerability classes: Cross-Site Scripting (XSS), Cross-Site Request Forgery (CSRF), and insecure File Upload. The proposed system not only detects digital threats in advance but also adapts its explanations to suit both technical and non-technical users, thereby enabling informed decision-making before users access potentially harmful content. Furthermore, the system offers actionable security recommendations through clear, tailored explanations, enhancing users’ ability to implement protective measures across diverse devices. The results from real-world testing suggest a notable improvement in users’ ability to understand and mitigate security risks in MCS environments. By combining proactive vulnerability detection with user-adaptive, explainable feedback, the CSXAI framework shows promise in empowering users to enhance their security posture effectively, even with minimal cybersecurity expertise. These findings underscore the potential of CSXAI as a reliable and accessible solution for tackling cybersecurity challenges in dynamic, multimedia-driven ecosystems. Quantitative results showed high user satisfaction and interpretability (SUS = 79.75 ± 6.40; USE subscales = 5.32–5.88). Full article

34 pages, 14464 KB  
Article
Modular IoT Architecture for Monitoring and Control of Office Environments Based on Home Assistant
by Yevheniy Khomenko and Sergii Babichev
IoT 2025, 6(4), 69; https://doi.org/10.3390/iot6040069 - 17 Nov 2025
Abstract
Cloud-centric IoT frameworks remain dominant; however, they introduce major challenges related to data privacy, latency, and system resilience. Existing open-source solutions often lack standardized principles for scalable, local-first deployment and do not adequately integrate fault tolerance with hybrid automation logic. This study presents a practical and extensible local-first IoT architecture designed for full operational autonomy using open-source components. The proposed system features a modular, layered design that includes device, communication, data, management, service, security, and presentation layers. It integrates MQTT, Zigbee, REST, and WebSocket protocols to enable reliable publish–subscribe and request–response communication among heterogeneous devices. A hybrid automation model combines rule-based logic with lightweight data-driven routines for context-aware decision-making. The implementation uses Proxmox-based virtualization with Home Assistant as the core automation engine and operates entirely offline, ensuring privacy and continuity without cloud dependency. The architecture was deployed in a real-world office environment and evaluated under workload and fault-injection scenarios. Results demonstrate stable operation with MQTT throughput exceeding 360,000 messages without packet loss, automatic recovery from simulated failures within three minutes, and energy savings of approximately 28% compared to baseline manual control. Compared to established frameworks such as FIWARE and IoT-A, the proposed approach achieves enhanced modularity, local autonomy, and hybrid control capabilities, offering a reproducible model for privacy-sensitive smart environments. Full article
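
A minimal sketch of the hybrid automation idea (a hard rule combined with a lightweight data-driven drift check) is shown below; the topic name, thresholds, and moving-average routine are illustrative assumptions and do not use Home Assistant's actual APIs.

```python
from collections import deque
from statistics import mean

class HybridAutomation:
    """Combines a hard rule with a lightweight data-driven baseline:
    actuate when a reading violates the rule OR drifts far from the
    recent moving average (context-aware trigger)."""
    def __init__(self, hard_limit=26.0, drift=1.5, window=12):
        self.hard_limit = hard_limit
        self.drift = drift
        self.history = deque(maxlen=window)

    def on_reading(self, topic: str, temperature: float):
        baseline = mean(self.history) if self.history else temperature
        self.history.append(temperature)
        if temperature > self.hard_limit:             # rule-based path
            return f"{topic}: turn cooling ON (limit exceeded)"
        if abs(temperature - baseline) > self.drift:  # data-driven path
            return f"{topic}: investigate sensor (sudden drift)"
        return None

automation = HybridAutomation()
for t in [23.1, 23.2, 23.0, 27.4]:
    action = automation.on_reading("office/zone1/temperature", t)
    if action:
        print(action)
```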

17 pages, 321 KB  
Article
Influence of Marital and Parental Status on Public Reactions to Stuttering in Chile: A Socio-Demographic Study
by Yasna Sandoval, Carlos Rojas, Bárbara Farías, Gabriel Lagos, Ángel Roco-Videla, Arnaldo Carocca and Goncalo Leal
Int. J. Environ. Res. Public Health 2025, 22(11), 1662; https://doi.org/10.3390/ijerph22111662 - 2 Nov 2025
Abstract
Stuttering is a communication disorder that significantly impacts individuals’ quality of life. This study examines public reactions towards stuttering within the Latin American context, specifically in Chile, using the Public Opinion Survey of Human Attributes-Stuttering. Data were collected from 400 adults, revealing that married individuals and parents exhibit heightened sensitivity and concern towards stuttering, especially regarding close family members. For instance, 56.86% of married respondents expressed worry about a stuttering sibling, contrasting sharply with only 27.18% of single respondents. Moreover, parents were notably anxious about stuttering in their family. This study underscores the significant role of marital status and parental responsibilities in shaping public attitudes towards stuttering. Additionally, it emphasizes the influence of family-centric values, advocating for the need for comprehensive educational initiatives to combat prevailing stigma towards individuals with stuttering. Full article
20 pages, 1100 KB  
Article
Data Distribution Strategies for Mixed Traffic Flows in Software-Defined Networks: A QoE-Driven Approach
by Hongming Li, Hao Li, Yuqing Ji and Ziwei Wang
Appl. Sci. 2025, 15(21), 11573; https://doi.org/10.3390/app152111573 - 29 Oct 2025
Abstract
The rapid proliferation of heterogeneous applications, from latency-critical video delivery to bandwidth-intensive file transfers, poses increasing challenges for modern communication networks. Traditional traffic engineering approaches often fall short in meeting diverse Quality of Experience (QoE) requirements under such conditions. To overcome these limitations, this study proposes a QoE-driven distribution framework for mixed traffic in Software-Defined Networking (SDN) environments. The framework integrates flow categorization, adaptive path selection, and feedback-based optimization to dynamically allocate resources in alignment with application-level QoE metrics. By prioritizing delay-sensitive flows while ensuring efficient handling of high-volume traffic, the approach achieves balanced performance across heterogeneous service demands. In our 15-RSU Mininet tests under service number = 1 and offered demand = 10 ms, JOGAF attains max end-to-end delays of 415.74 ms, close to the 399.64 ms achieved by DOGA, while reducing the number of active hosts from 5 to 3 compared with DOGA. By contrast, HNOGA exhibits delayed growth of up to 7716.16 ms with 2 working hosts, indicating poorer suitability for latency-sensitive flows. These results indicate that JOGAF achieves near-DOGA latency with substantially lower host activation, offering a practical energy-aware alternative for mixed traffic SDN deployments. Beyond generic communication scenarios, the framework also shows strong potential in Intelligent Transportation Systems (ITS), where SDN-enabled vehicular networks require adaptive, user-centric service quality management. This work highlights the necessity of coupling classical traffic engineering concepts with SDN programmability to address the multifaceted challenges of next-generation networking. Moreover, it establishes a foundation for scalable, adaptive data distribution strategies capable of enhancing user experience while maintaining robustness across dynamic traffic environments. Full article
(This article belongs to the Section Electrical, Electronics and Communications Engineering)
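
The flow-categorization and path-selection step can be illustrated with a toy heuristic: delay-sensitive flows take the lowest-latency path, bulk flows take the path with the most spare bandwidth. The packet-size and inter-arrival thresholds and the path metrics below are assumptions, not the paper's JOGAF/DOGA algorithms.

```python
def classify_flow(avg_pkt_size: int, inter_arrival_ms: float) -> str:
    """Rough heuristic: small, frequent packets -> delay-sensitive;
    large, steady packets -> throughput-oriented (bulk transfer)."""
    if avg_pkt_size < 600 and inter_arrival_ms < 20:
        return "delay_sensitive"
    return "bulk"

def select_path(flow_class: str, paths):
    """Delay-sensitive flows get the lowest-latency path; bulk flows get
    the path with the most residual bandwidth."""
    if flow_class == "delay_sensitive":
        return min(paths, key=lambda p: p["latency_ms"])
    return max(paths, key=lambda p: p["free_mbps"])

paths = [
    {"id": "A", "latency_ms": 8,  "free_mbps": 120},
    {"id": "B", "latency_ms": 25, "free_mbps": 800},
]
print(select_path(classify_flow(200, 5.0), paths)["id"])    # -> A
print(select_path(classify_flow(1400, 40.0), paths)["id"])  # -> B
```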

28 pages, 4508 KB  
Article
Mixed Reality-Based Multi-Scenario Visualization and Control in Automated Terminals: A Middleware and Digital Twin Driven Approach
by Yubo Wang, Enyu Zhang, Ang Yang, Keshuang Du and Jing Gao
Buildings 2025, 15(21), 3879; https://doi.org/10.3390/buildings15213879 - 27 Oct 2025
Abstract
This study presents a Digital Twin–Mixed Reality (DT–MR) framework for the immersive and interactive supervision of automated container terminals (ACTs), addressing the fragmented data and limited situational awareness of conventional 2D monitoring systems. The framework employs a middleware-centric architecture that integrates heterogeneous subsystems—covering terminal operation, equipment control, and information management—through standardized industrial communication protocols. It ensures synchronized timestamps and delivers semantically aligned, low-latency data streams to a multi-scale Digital Twin developed in Unity. The twin applies level-of-detail modeling, spatial anchoring, and coordinate alignment (from Industry Foundation Classes (IFCs) to east–north–up (ENU) coordinates and Unity space) for accurate registration with physical assets, while a Microsoft HoloLens 2 device provides an intuitive Mixed Reality interface that combines gaze, gesture, and voice commands with built-in safety interlocks for secure human–machine interaction. Quantitative performance benchmarks—latency ≤100 ms, status refresh ≤1 s, and throughput ≥10,000 events/s—were met through targeted engineering and validated using representative scenarios of quay crane alignment and automated guided vehicle (AGV) rerouting, demonstrating improved anomaly detection, reduced decision latency, and enhanced operational resilience. The proposed DT–MR pipeline establishes a reproducible and extensible foundation for real-time, human-in-the-loop supervision across ports, airports, and other large-scale smart infrastructures. Full article
(This article belongs to the Special Issue Digital Technologies, AI and BIM in Construction)

21 pages, 1007 KB  
Article
DD-CC-II: Data Driven Cell–Cell Interaction Inference and Its Application to COVID-19
by Heewon Park and Satoru Miyano
Int. J. Mol. Sci. 2025, 26(20), 10170; https://doi.org/10.3390/ijms262010170 - 19 Oct 2025
Abstract
Cell–cell interactions play a pivotal role in maintaining tissue homeostasis and driving disease progression. Conventional Cell–cell interactions modeling approaches depend on ligand–receptor databases, which often fail to capture context-specific or newly emerging signaling mechanisms. To address this limitation, we propose a data-driven computational framework, data-driven cell–cell interaction inference (DD-CC-II), which employs a graph-based model using eigen-cells to represent cell groups. DD-CC-II uses eigen-cells (i.e., functional module within the cell population) to characterize cell groups and construct correlation coefficient networks to model between-group associations. Correlation coefficient networks between eigen-cells are constructed, and their statistical significance is evaluated via over-representation analysis and hypergeometric testing. Monte Carlo simulations demonstrate that DD-CC-II achieves superior performance in inferring CCIs compared with ligand–receptor-based methods. The application to whole-blood RNA-seq data from the Japan COVID-19 Task Force revealed severity stage-specific interaction patterns. Markers such as FOS, CXCL8, and HLA-A were associated with high severity, whereas IL1B, CD3D, and CCL5 were related to low severity. The systemic lupus erythematosus pathway emerged as a potential immune mechanism underlying disease severity. Overall, DD-CC-II provides a data-centric approach for mapping the cellular communication landscape, facilitating a better understanding of disease progression at the intercellular level. Full article
(This article belongs to the Special Issue Advances in Biomathematics, Computational Biology, and Bioengineering)
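
A compact sketch of the eigen-cell idea follows: summarize each cell group by its first-principal-component scores and correlate those summaries across groups to propose network edges. The synthetic data, shared latent factor, and correlation cut-off are illustrative assumptions rather than the published DD-CC-II pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def eigen_cell(expr: np.ndarray) -> np.ndarray:
    """First-principal-component scores of a (samples x genes) matrix,
    used as a one-dimensional 'eigen-cell' summary of the cell group."""
    centered = expr - expr.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]

# Hypothetical data: three cell groups measured over the same 30 samples,
# groups A and B driven by a shared latent signal, group C independent.
latent = rng.normal(size=(30, 1))
group_a = latent @ rng.normal(size=(1, 50)) + rng.normal(scale=0.5, size=(30, 50))
group_b = latent @ rng.normal(size=(1, 50)) + rng.normal(scale=0.5, size=(30, 50))
group_c = rng.normal(size=(30, 50))

ea, eb, ec = (eigen_cell(g) for g in (group_a, group_b, group_c))
print("|corr(A,B)| =", round(abs(np.corrcoef(ea, eb)[0, 1]), 2))  # strong edge
print("|corr(A,C)| =", round(abs(np.corrcoef(ea, ec)[0, 1]), 2))  # weak edge
# Edges whose |corr| clears a chosen cut-off would form the between-group network.
```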

19 pages, 578 KB  
Article
Exploring the Interplay Between Job Satisfaction and Employee Retention in Romania’s Hospitality Sector: A Comprehensive Analysis
by Ioana C. Patrichi, Tudor M. Edu, Camelia M. Gheorghe, Stefania C. Antonovici and Catrinel R. Dridea
Sustainability 2025, 17(20), 8971; https://doi.org/10.3390/su17208971 - 10 Oct 2025
Abstract
This study investigates the complex interplay between internal communication, psychological well-being, and job satisfaction, as well as their influence on employee retention and job performance in Romania’s post-pandemic hospitality sector. In this study, data were collected from 350 employees across hotels, restaurants, and resorts. A Covariance-Based Structural Equation Modeling (CB-SEM) approach was employed for the analysis. Findings suggest that both internal communication and psychological well-being are significant positive predictors of job satisfaction. In turn, job satisfaction is a powerful driver of both employee retention and job performance. A key finding is that job satisfaction fully mediates the relationship between psychological well-being and job performance, with no direct effect observed between the latter two constructs. These results underscore that fostering an employee-centric environment is crucial for achieving social sustainability, directly supporting global Sustainable Development Goals (SDG 8: Decent Work and Economic Growth and SDG 3: Good Health and Well-being). Theoretical and practical implications, as well as limitations and future research directions, are discussed. Full article
(This article belongs to the Section Tourism, Culture, and Heritage)

36 pages, 2113 KB  
Article
Self-Sovereign Identities and Content Provenance: VeriTrust—A Blockchain-Based Framework for Fake News Detection
by Maruf Farhan, Usman Butt, Rejwan Bin Sulaiman and Mansour Alraja
Future Internet 2025, 17(10), 448; https://doi.org/10.3390/fi17100448 - 30 Sep 2025
Abstract
The widespread circulation of digital misinformation exposes a critical shortcoming in prevailing detection strategies, namely, the absence of robust mechanisms to confirm the origin and authenticity of online content. This study addresses this by introducing VeriTrust, a conceptual and provenance-centric framework designed to establish content-level trust by integrating Self-Sovereign Identity (SSI), blockchain-based anchoring, and AI-assisted decentralized verification. The proposed system is designed to operate through three key components: (1) issuing Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) through Hyperledger Aries and Indy; (2) anchoring cryptographic hashes of content metadata to an Ethereum-compatible blockchain using Merkle trees and smart contracts; and (3) enabling a community-led verification model enhanced by federated learning with future extensibility toward zero-knowledge proof techniques. Theoretical projections, derived from established performance benchmarks, suggest the framework offers low latency and high scalability for content anchoring and minimal on-chain transaction fees. It also prioritizes user privacy by ensuring no on-chain exposure of personal data. VeriTrust redefines misinformation mitigation by shifting from reactive content-based classification to proactive provenance-based verification, forming a verifiable link between digital content and its creator. VeriTrust, while currently at the conceptual and theoretical validation stage, holds promise for enhancing transparency, accountability, and resilience against misinformation attacks across journalism, academia, and online platforms. Full article
(This article belongs to the Special Issue AI and Blockchain: Synergies, Challenges, and Innovations)
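
The anchoring step can be sketched as hashing content-metadata records into a single Merkle root that would then be committed on-chain; the record fields and the pairwise-SHA-256 tree below are an illustrative assumption, not the VeriTrust implementation.

```python
import hashlib
import json

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Pairwise-hash leaves upward until a single root remains
    (an odd trailing node is duplicated at each level)."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical content-metadata records tied to creators' DIDs.
records = [
    {"did": "did:example:alice", "content_hash": hashlib.sha256(b"post-1").hexdigest()},
    {"did": "did:example:alice", "content_hash": hashlib.sha256(b"post-2").hexdigest()},
    {"did": "did:example:bob",   "content_hash": hashlib.sha256(b"post-3").hexdigest()},
]
leaves = [json.dumps(r, sort_keys=True).encode() for r in records]
print("Merkle root to anchor on-chain:", merkle_root(leaves).hex())
```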

33 pages, 20640 KB  
Article
A Complex Network Science Perspective on Urban Parcel Locker Placement
by Enrico Corradini, Mattia Mandorlini, Filippo Mariani, Paolo Roselli, Samuele Sacchetti and Matteo Spiga
Big Data Cogn. Comput. 2025, 9(10), 249; https://doi.org/10.3390/bdcc9100249 - 30 Sep 2025
Abstract
The rapid rise of e-commerce is intensifying pressure on last-mile delivery networks, making the strategic placement of parcel lockers an urgent urban challenge. In this work, we adapt multilayer two-mode Social Network Analysis to the parcel-locker siting problem, modeling city-scale systems as bipartite networks linking spatially resolved demand zones to locker locations using only open-source demographic and geographic data. We introduce two new Social Network Analysis metrics, Dual centrality and Coverage centrality, designed to identify both structurally critical and highly accessible lockers within the network. Applying our framework to Milan, Rome, and Naples, we find that conventional coverage-based strategies successfully maximize immediate service reach, but tend to prioritize redundant hubs. In contrast, Dual centrality reveals a distinct set of lockers whose presence is essential for maintaining overall connectivity and resilience, often acting as hidden bridges between user communities. Comparative analysis with state-of-the-art multi-criteria optimization baselines confirms that our network-centric metrics deliver complementary, and in some cases better, guidance for robust locker placement. Our results show that a network-analytic lens yields actionable guidance for resilient last-mile locker siting. The method is reproducible from open data (potential-access weights) and plug-in compatible with observed assignments. Importantly, the path-based results (Coverage centrality) are adjacency-driven and thus largely insensitive to volumetric weights. Full article
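
A toy version of the bipartite demand-zone/locker model is sketched below, using locker degree as a coverage proxy and betweenness as a rough stand-in for the paper's Dual and Coverage centrality metrics; the graph and the choice of proxies are assumptions for illustration.

```python
import networkx as nx

# Bipartite graph: demand zones (Z*) connected to the lockers (L*) that serve them.
G = nx.Graph()
G.add_edges_from([
    ("Z1", "L1"), ("Z1", "L2"),
    ("Z2", "L2"),
    ("Z3", "L2"), ("Z3", "L3"),
    ("Z4", "L3"), ("Z4", "L4"),
    ("Z5", "L4"),
])
lockers = [n for n in G if n.startswith("L")]

# Simple coverage proxy: how many demand zones a locker reaches directly.
coverage = {l: G.degree(l) for l in lockers}

# Structural criticality proxy: betweenness highlights lockers that bridge
# otherwise weakly connected zones.
bridging = nx.betweenness_centrality(G)

print("coverage :", coverage)
print("bridging :", {l: round(bridging[l], 2) for l in lockers})
```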

16 pages, 1135 KB  
Article
Geomatics Education in the Mining Industry: Assessing Competency Targets and Implementation Challenges
by Artur Krawczyk
Geosciences 2025, 15(10), 374; https://doi.org/10.3390/geosciences15100374 - 30 Sep 2025
Abstract
This article discusses the growing role of Geomatics and Geoinformatics in the education of specialists at various universities, with particular emphasis on the mining industry. The study used a qualitative desk research approach that included analysis of institutional curricula, legislative documents, and scientific publications related to spatial data processing, mining education, and the geomatics professional framework. The paper identifies the requirements and educational goals for mining geomaticians who can act as managers of spatial data infrastructure in mining companies, managing the mining modelling processes and its impact on the environment and the local community. The article concludes with recommendations for restructuring educational programs to meet industry expectations for data-centric mining operations. The introduction of the profile of the ’mining geomatician’ can contribute significantly to the development of the industry, making it more modern and adapted to the requirements of a modern knowledge and technology-based economy. Full article