Search Results (467)

Search Parameters:
Keywords = data exchange platforms

22 pages, 979 KB  
Article
Multi-Modal Semantic Fusion for Smart Contract Vulnerability Detection in Cloud-Based Blockchain Analytics Platforms
by Xingyu Zeng, Qiaoyan Wen and Sujuan Qin
Electronics 2025, 14(21), 4188; https://doi.org/10.3390/electronics14214188 - 27 Oct 2025
Abstract
With the growing demand for trusted computing in big data analysis, cloud computing platforms are reshaping trusted data infrastructure by integrating Blockchain as a Service (BaaS), which uses elastic resource scheduling and heterogeneous hardware acceleration to support petabyte-scale, multi-institution secure data exchange in medicine, finance, and other fields. As the core hub of data-intensive scenarios, a BaaS platform combines privacy-preserving computation with process automation. However, its deep dependence on smart contracts introduces new code-level vulnerabilities that can maliciously contaminate analysis results. Existing detection schemes are limited to a single-source-data perspective, making it difficult to capture both global semantic associations and local structural details in a cloud computing environment and creating a bottleneck in scalability and detection accuracy. To address these challenges, this paper proposes a smart contract vulnerability detection method based on multi-modal semantic fusion for cloud-based blockchain analytics platforms. First, the contract source code is parsed into an abstract syntax tree, and the key code is located using a predefined vulnerability feature set. Then, textual features and graph-structure features of the key code are extracted in parallel and deeply fused. Finally, with attention enhancement, the vulnerability probability is output through a fully connected network. Experiments on Ethereum benchmark datasets show that the detection accuracy of our method for re-entrancy, timestamp, overflow/underflow, and delegatecall vulnerabilities reaches 92.2%, 96.3%, 91.4%, and 89.5%, respectively, surpassing previous methods. The method also shows potential for practical deployment in cloud-based blockchain service environments.
(This article belongs to the Special Issue New Trends in Cloud Computing for Big Data Analytics)
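
To illustrate the text/graph fusion idea described in this abstract, the following minimal PyTorch sketch combines a token encoder with a one-hop graph aggregation and an attention step, ending in a fully connected head that outputs a vulnerability probability. It is not the authors' model; the class name FusionDetector, the dimensions, and the random toy inputs are assumptions for demonstration only.

    import torch
    import torch.nn as nn

    class FusionDetector(nn.Module):
        # Two-branch fusion: a GRU over source tokens plus a crude one-hop
        # graph aggregation, combined by multi-head attention.
        def __init__(self, vocab_size=5000, dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, dim)
            self.text_rnn = nn.GRU(dim, dim, batch_first=True)
            self.node_proj = nn.Linear(dim, dim)
            self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
            self.head = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

        def forward(self, tokens, node_feats, adj):
            _, h = self.text_rnn(self.embed(tokens))
            text_vec = h[-1]                                   # (batch, dim) text summary
            agg = torch.bmm(adj, node_feats)                   # one round of neighbour averaging
            graph_nodes = torch.relu(self.node_proj(agg))
            fused, _ = self.attn(text_vec.unsqueeze(1), graph_nodes, graph_nodes)
            z = torch.cat([text_vec, fused.squeeze(1)], dim=-1)
            return torch.sigmoid(self.head(z))                 # vulnerability probability

    model = FusionDetector()
    tokens = torch.randint(0, 5000, (2, 120))                  # toy token ids
    node_feats, adj = torch.rand(2, 30, 64), torch.rand(2, 30, 30)
    print(model(tokens, node_feats, adj).shape)                 # torch.Size([2, 1])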

27 pages, 6565 KB  
Article
BLE-Based Custom Devices for Indoor Positioning in Ambient Assisted Living Systems: Design and Prototyping
by David Díaz-Jiménez, José L. López Ruiz, Juan Carlos Cuevas-Martínez, Joaquín Torres-Sospedra, Enrique A. Navarro and Macarena Espinilla Estévez
Sensors 2025, 25(20), 6499; https://doi.org/10.3390/s25206499 - 21 Oct 2025
Viewed by 458
Abstract
This work presents the design and prototyping of two reconfigurable BLE-based devices developed to overcome the limitations of commercial platforms in terms of configurability, data transparency, and energy efficiency. The first is a wearable smart wristband integrating inertial and biometric sensors, while the second is a configurable beacon (ASIA Beacon) able to dynamically adjust key transmission parameters such as channel selection and power level. Both devices were engineered with energy-aware components, OTA update support, and flexible 3D-printed enclosures optimized for residential environments. The firmware, developed under Zephyr RTOS, exposes data through standardized interfaces (GATT, MQTT), facilitating their integration into IoT architectures and research-oriented testbeds. Initial experiments carried out in an anechoic chamber demonstrated improved RSSI stability, extended autonomy (up to 4 months for beacons and 3 weeks for the wristband), and reliable real-time data exchange. These results highlight the feasibility and potential of the proposed devices for future deployment in ambient assisted living environments, while the focus of this work remains on the hardware and software development process and its validation. Full article
(This article belongs to the Special Issue RF and IoT Sensors: Design, Optimization and Applications)

24 pages, 468 KB  
Article
Mining User Perspectives: Multi Case Study Analysis of Data Quality Characteristics
by Minnu Malieckal and Anjula Gurtoo
Information 2025, 16(10), 920; https://doi.org/10.3390/info16100920 - 21 Oct 2025
Viewed by 203
Abstract
With the growth of digital economies, data quality is a key factor in enabling use and delivering value. Existing research defines quality through technical benchmarks or provider-led frameworks; our study shifts the focus to actual users. Although a comprehensive literature review identifies thirty-seven distinct data quality dimensions, these offer limited applicability for practitioners seeking actionable guidance. To address this gap, in-depth interviews with senior professionals from 25 organizations were conducted, representing sectors such as computer science and technology, finance, environmental, social, and governance (ESG), and urban infrastructure. The data were analysed using content analysis with two-level coding, supported by NVivo R1 software. Several newer perspectives emerged. First, data quality is not simply about accuracy or completeness; rather, it depends on suitability for real-world tasks. Second, trust grows with data transparency: knowing where the data come from and how they were processed matters as much as the data themselves. Third, users are open to paying for data, provided the data are clean, reliable, and ready to use. These and other results suggest that data users focus on a narrower, more practical set of priorities considered essential in actual workflows. Rethinking quality from a consumer's perspective offers a practical path to building credible and accessible data ecosystems. This study is particularly useful for data platform designers, policymakers, and organisations aiming to strengthen data quality and trust in data exchange ecosystems.

18 pages, 1715 KB  
Article
hiPSCGEM01: A Genome-Scale Metabolic Model for Fibroblast-Derived Human iPSCs
by Anna Procopio, Elvira Immacolata Parrotta, Stefania Scalise, Paolo Zaffino, Rita Granata, Francesco Amato, Giovanni Cuda and Carlo Cosentino
Bioengineering 2025, 12(10), 1128; https://doi.org/10.3390/bioengineering12101128 - 21 Oct 2025
Viewed by 327
Abstract
Human induced pluripotent stem cells (hiPSCs), generated in vitro, represent a groundbreaking tool for tissue regeneration and repair. Understanding the metabolic intricacies governing hiPSCs is crucial for optimizing their performance across diverse environmental conditions and improving production strategies. To this end, this work introduces hiPSCGEM01, the first genome-scale, context-specific metabolic model (GEM) uniquely tailored to fibroblast-derived hiPSCs, marking a clear distinction from existing models of embryonic and cancer stem cells. hiPSCGEM01 was developed using gene expression data carefully selected from the Gene Expression Omnibus (GEO) and integrated with the RECON 3D framework, a comprehensive genome-scale model of human metabolism. Redundant and unused reactions and genes were identified and removed from the model. Key reactions, including those facilitating the exchange and transport of metabolites between extracellular and intracellular environments, along with all metabolites required to simulate the growth medium, were integrated into hiPSCGEM01. Finally, blocked reactions and dead-end metabolites were identified and resolved. Knockout simulations combined with flux balance analysis (FBA) were employed to identify essential genes and metabolites within the metabolic network, providing a comprehensive systems-level view of fibroblast-derived hiPSC metabolism. Notably, the model uncovered the unexpected involvement of nitrate and xenobiotic metabolism, pathways not previously associated with hiPSCs, highlighting potential novel mechanisms of cellular adaptation that merit further investigation. hiPSCGEM01 establishes a robust platform for in silico analysis and the rational optimization of in vitro experiments. Future applications include the evaluation and refinement of culture media, the design of new formulations, and the prediction of hiPSC responses under varying growth conditions, ultimately advancing both experimental and clinical outcomes.
(This article belongs to the Section Cellular and Molecular Bioengineering)
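
For readers unfamiliar with the knockout-plus-FBA workflow mentioned above, the short COBRApy sketch below runs single-gene deletions on the bundled E. coli "textbook" model as a stand-in for hiPSCGEM01 (which is not reproduced here); the 5% growth threshold for calling a gene essential is an arbitrary illustrative choice.

    # Requires: pip install cobra  (and the bundled/downloadable "textbook" demo model)
    from cobra.io import load_model
    from cobra.flux_analysis import single_gene_deletion

    model = load_model("textbook")                       # small E. coli model as a stand-in
    wild_type = model.optimize().objective_value         # FBA growth of the unmodified network
    knockouts = single_gene_deletion(model)              # FBA after deleting each gene in turn
    essential = knockouts[knockouts["growth"] < 0.05 * wild_type]
    print(f"{len(essential)} of {len(knockouts)} genes look essential at a 5% growth cutoff")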

25 pages, 10667 KB  
Article
Adaptive Exposure Optimization for Underwater Optical Camera Communication via Multimodal Feature Learning and Real-to-Sim Channel Emulation
by Jiongnan Lou, Xun Zhang, Haifei Shen, Yiqian Qian, Zhan Wang, Hongda Chen, Zefeng Wang and Lianxin Hu
Sensors 2025, 25(20), 6436; https://doi.org/10.3390/s25206436 - 17 Oct 2025
Viewed by 440
Abstract
Underwater Optical Camera Communication (UOCC) has emerged as a promising paradigm for short-range, high-bandwidth, and secure data exchange in autonomous underwater vehicles (AUVs). UOCC performance strongly depends on exposure time and ISO sensitivity, two parameters that govern photon capture, contrast, and bit detection fidelity. However, optical propagation in aquatic environments is highly susceptible to turbidity, scattering, and illumination variability, which severely degrade image clarity and signal-to-noise ratio (SNR). Conventional systems with fixed imaging settings cannot adapt to time-varying conditions, limiting communication reliability. An earlier CNN-based baseline validated the feasibility of deep learning for exposure prediction but lacked environmental awareness and generalization to dynamic scenarios. To overcome these limitations, we introduce a Real-to-Sim-to-Deployment framework that couples a physically calibrated emulation platform with a Hybrid CNN-MLP Model (HCMM). By fusing optical images, environmental states, and camera configurations, the HCMM achieves substantially improved parameter prediction accuracy, reducing RMSE to 0.23–0.33. When deployed on embedded hardware, it enables real-time adaptive reconfiguration and delivers up to an 8.5 dB SNR gain, surpassing both static-parameter systems and the prior CNN baseline. These results demonstrate that environment-aware multimodal learning, supported by reproducible optical channel emulation, provides a scalable and robust solution for practical UOCC deployment in positioning, inspection, and laser-based underwater communication.
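
The hybrid CNN-MLP idea described above can be sketched in a few lines of PyTorch: image features from a small CNN are concatenated with a vector of environmental and camera readings before a regression head predicts the two exposure parameters. This is only a schematic stand-in for the HCMM; the layer sizes, the four-element environment vector, and the normalised two-output target are assumptions.

    import torch
    import torch.nn as nn

    class HybridCNNMLP(nn.Module):
        def __init__(self):
            super().__init__()
            self.cnn = nn.Sequential(nn.Conv2d(1, 8, 3, stride=2), nn.ReLU(),
                                     nn.Conv2d(8, 16, 3, stride=2), nn.ReLU(),
                                     nn.AdaptiveAvgPool2d(1), nn.Flatten())
            self.mlp = nn.Sequential(nn.Linear(16 + 4, 32), nn.ReLU(),
                                     nn.Linear(32, 2))     # e.g. normalised exposure time and ISO

        def forward(self, img, env):
            return self.mlp(torch.cat([self.cnn(img), env], dim=1))

    model = HybridCNNMLP()
    pred = model(torch.rand(8, 1, 64, 64), torch.rand(8, 4))   # toy batch of images + env states
    print(pred.shape)                                           # torch.Size([8, 2])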

21 pages, 12126 KB  
Article
Optimization of Synergistic Water Resources, Water Environment, and Water Ecology Remediation and Restoration Project: Application in the Jinshan Lake Basin
by Wenyang Jiang, Xin Liu, Yue Wang, Yue Zhang, Xinxin Chen, Yuxing Sun, Jun Chen and Wanshun Zhang
Water 2025, 17(20), 2986; https://doi.org/10.3390/w17202986 - 16 Oct 2025
Viewed by 271
Abstract
The concept of synergistic water resources, water environment, and water ecology remediation and restoration (3WRR) is essential for addressing the interlinked challenges of water scarcity, pollution, and ecological degradation. An intelligent platform for optimizing remediation and restoration projects was developed, integrating multi-source data fusion, a coupled air–land–water model, and dynamic decision optimization to support 3WRR in river basins. Applied to the Jinshan Lake Basin (JLB) in China's Greater Bay Area, the platform assessed 894 scenarios encompassing diverse remediation and restoration plans, including point/non-point source reduction, sediment dredging, recycled water reuse, ecological water replenishment, and sluice gate control, while accounting for inter-annual meteorological variability. The results reveal that source control alone (a 95% reduction in point and non-point loads) leads to limited improvement, achieving less than 2% compliance with Class IV water quality standards in tributaries. Integrated engineering–ecological interventions, combining sediment dredging with high-flow replenishment from the Xizhijiang River (26.1 m³/s), increase the number of days in compliance with Class IV water quality standards by 10–51 days. For the lake-focused plans, including sluice regulation and large-volume water exchange, the lake area achieved over 90% compliance with the Class IV standard for COD, NH3-N, and TP. The platform's multi-objective optimization framework highlights that coordinated, multi-scale interventions substantially outperform isolated strategies in both effectiveness and sustainability. These findings provide a replicable, data-driven paradigm for 3WRR implementation in complex river–lake systems. Applying and promoting the platform in other watersheds worldwide can enable low-cost, high-efficiency management of watershed water environments.
(This article belongs to the Section Water Resources Management, Policy and Governance)

43 pages, 6017 KB  
Article
An Efficient Framework for Automated Cyber Threat Intelligence Sharing
by Muhammad Dikko Gambo, Ayaz H. Khan, Ahmad Almulhem and Basem Almadani
Electronics 2025, 14(20), 4045; https://doi.org/10.3390/electronics14204045 - 15 Oct 2025
Viewed by 530
Abstract
As cyberattacks grow increasingly sophisticated, the timely exchange of Cyber Threat Intelligence (CTI) has become essential to enhancing situational awareness and enabling proactive defense. Several challenges exist in CTI sharing, including the timely dissemination of threat information, the need for privacy and confidentiality, and the accessibility of data even in unstable network conditions. In addition to security and privacy, latency and throughput are critical performance metrics when selecting a suitable platform for CTI sharing. Substantial efforts have been devoted to developing effective solutions for CTI sharing. Several existing CTI sharing systems adopt either centralized or blockchain-based architectures. However, centralized models suffer from scalability bottlenecks and single points of failure, while the slow and limited transactions of blockchain make it unsuitable for real-time and reliable CTI sharing. To address these challenges, we propose a DDS-based framework that automates data sanitization, STIX-compliant structuring, and real-time dissemination of CTI. Our prototype evaluation demonstrates low latency, linear throughput scaling at configured send rates up to 125 messages per second, with 100% delivery success across all scenarios, while sustaining low CPU and memory overheads. The findings of this study highlight the unique ability of DDS to overcome the timeliness, security, automation, and reliability challenges of CTI sharing. Full article
(This article belongs to the Special Issue New Trends in Cryptography, Authentication and Information Security)
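
As a small illustration of the STIX-compliant structuring step mentioned in this abstract, the snippet below builds a STIX 2.1 indicator with the OASIS stix2 Python library and serializes it to JSON, the kind of payload a DDS writer could then disseminate. The indicator value and name are invented; this is not the framework's actual pipeline.

    # Requires: pip install stix2
    from stix2 import Indicator, Bundle

    ioc = Indicator(
        name="Suspicious C2 address",                      # hypothetical observation
        pattern="[ipv4-addr:value = '203.0.113.7']",       # documentation-range IP
        pattern_type="stix",
    )
    bundle = Bundle(ioc)
    print(bundle.serialize(pretty=True))                   # JSON a DDS DataWriter could publish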

23 pages, 3722 KB  
Article
Automated T-Cell Proliferation in Lab-on-Chip Devices Integrating Microfluidics and Deep Learning-Based Image Analysis for Long-Term Experiments
by María Fernanda Cadena Vizuete, Martin Condor, Dennis Raith, Avani Sapre, Marie Follo, Gina Layedra, Roland Mertelsmann, Maximiliano Perez and Betiana Lerner
Biosensors 2025, 15(10), 693; https://doi.org/10.3390/bios15100693 - 13 Oct 2025
Viewed by 449
Abstract
T cells play a pivotal role in cancer research, particularly in immunotherapy, which harnesses the immune system to target malignancies. However, conventional expansion methods face limitations such as high reagent consumption, contamination risks, and difficulties in maintaining suspension cells in dynamic culture environments. This study presents a microfluidic system for long-term culture of non-adherent cells, featuring automated perfusion and image acquisition. The system integrates deep learning-based image analysis, which quantifies cell coverage and estimates cell numbers, and efficiently processes large volumes of data. The performance of this deep learning approach was benchmarked against the widely used Trainable Weka Segmentation (TWS) plugin for Fiji. Additionally, two distinct lab-on-a-chip (LOC) devices were evaluated independently: the commercial ibidi® LOC and a custom-made PDMS LOC. The setup supported the proliferation of Jurkat cells and primary human T cells without significant loss during medium exchange. Each platform proved suitable for long-term expansion while offering distinct advantages in terms of design, cell seeding and recovery, and reusability. This integrated approach enables extended experiments with minimal manual intervention, stable perfusion, and supports multi-reagent administration, offering a powerful platform for advancing suspension cell research in immunotherapy. Full article

20 pages, 1956 KB  
Review
Interoperability as a Catalyst for Digital Health and Therapeutics: A Scoping Review of Emerging Technologies and Standards (2015–2025)
by Kola Adegoke, Abimbola Adegoke, Deborah Dawodu, Akorede Adekoya, Ayoola Bayowa, Temitope Kayode and Mallika Singh
Int. J. Environ. Res. Public Health 2025, 22(10), 1535; https://doi.org/10.3390/ijerph22101535 - 8 Oct 2025
Viewed by 900
Abstract
Background: Interoperability is fundamental for advancing digital health and digital therapeutics, particularly with the integration of technologies such as artificial intelligence (AI), blockchain, and federated learning. Low- and middle-income countries (LMICs), where digital infrastructure remains fragmented, face specific challenges in implementing standardized and scalable systems. Methods: This scoping review was conducted using the Arksey and O’Malley framework, refined by Levac et al., and the Joanna Briggs Institute guidelines. Five databases (PubMed, Scopus, IEEE Xplore, ACM Digital Library, and Google Scholar) were searched for peer-reviewed English language studies published between 2015 and 2025. We identified 255 potentially eligible articles and selected a 10% random sample (n = 26) using Stata 18 by StataCorp LLC, College Station, TX, USA, for in-depth data charting and thematic synthesis. Results: The selected studies spanned over 15 countries and addressed priority technologies, including mobile health (mHealth), the use of Health Level Seven (HL7)’s Fast Healthcare Interoperability Resources (FHIR) for data exchange, and blockchain. Interoperability enablers include standards (e.g., HL7 FHIR), data governance frameworks, and policy interventions. Low- and Middle-Income Countries (LMICs) face common issues related to digital capacity shortages, legacy systems, and governance fragmentation. Five thematic areas were identified: (1) policy and governance; (2) standards-based integration; (3) infrastructure and platforms; (4) emerging technologies; and (5) LMIC implementation issues. Conclusions: Emerging digital health technologies increasingly rely on interoperability standards to scale their operation. Although global standards such as FHIR and the Trusted Exchange Framework and Common Agreement (TEFCA) are gaining momentum, LMICs require dedicated governance, infrastructure, and capacity investments to make equitable use feasible. Future initiatives can benefit from using science- and equity-informed frameworks. Full article

58 pages, 4299 KB  
Article
Optimisation of Cryptocurrency Trading Using the Fractal Market Hypothesis with Symbolic Regression
by Jonathan Blackledge and Anton Blackledge
Commodities 2025, 4(4), 22; https://doi.org/10.3390/commodities4040022 - 3 Oct 2025
Viewed by 950
Abstract
Cryptocurrencies such as Bitcoin can be classified as commodities under the Commodity Exchange Act (CEA), giving the Commodity Futures Trading Commission (CFTC) jurisdiction over those cryptocurrencies deemed commodities, particularly in the context of futures trading. This paper presents a method for predicting both long- and short-term trends in selected cryptocurrencies based on the Fractal Market Hypothesis (FMH). The FMH applies the self-affine properties of fractal stochastic fields to model financial time series. After introducing the underlying theory and mathematical framework, a fundamental analysis of Bitcoin and Ethereum exchange rates against the U.S. dollar is conducted. The analysis focuses on changes in the polarity of the ‘Beta-to-Volatility’ and ‘Lyapunov-to-Volatility’ ratios as indicators of impending shifts in Bitcoin/Ethereum price trends. These signals are used to recommend long, short, or hold trading positions, with corresponding algorithms (implemented in Matlab R2023b) developed and back-tested. An optimisation of these algorithms identifies ideal parameter ranges that maximise both accuracy and profitability, thereby ensuring high confidence in the predictions. The resulting trading strategy provides actionable guidance for cryptocurrency investment and quantifies the likelihood of bull or bear market dominance. Under stable market conditions, machine learning (using the ‘TuringBot’ platform) is shown to produce reliable short-horizon estimates of future price movements and fluctuations. This reduces trading delays caused by data filtering and increases returns by identifying optimal positions within rapid ‘micro-trends’ that would otherwise remain undetected—yielding gains of up to approximately 10%. Empirical results confirm that Bitcoin and Ethereum exchanges behave as self-affine (fractal) stochastic fields with Lévy distributions, exhibiting a Hurst exponent of roughly 0.32, a fractal dimension of about 1.68, and a Lévy index near 1.22. These findings demonstrate that the Fractal Market Hypothesis and its associated indices provide a robust market model capable of generating investment returns that consistently outperform standard Buy-and-Hold strategies. Full article
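
For readers curious how a Hurst exponent like the reported 0.32 can be obtained in practice, the sketch below uses a simple variance-scaling estimate on log-returns (the variance of m-step aggregated returns scales roughly as m to the power 2H, and the fractal dimension is then 2 - H). It is a generic textbook estimator applied to synthetic data, not the paper's FMH indices or its Beta-to-Volatility and Lyapunov-to-Volatility ratios.

    import numpy as np

    def hurst_variance_scaling(prices, scales=(1, 2, 4, 8, 16, 32)):
        r = np.diff(np.log(np.asarray(prices, dtype=float)))    # log-returns
        v = []
        for m in scales:
            blocks = r[: (len(r) // m) * m].reshape(-1, m).sum(axis=1)
            v.append(blocks.var())
        slope, _ = np.polyfit(np.log(scales), np.log(v), 1)     # slope is roughly 2H
        return slope / 2.0

    prices = np.exp(np.cumsum(np.random.normal(0, 0.01, 5000)))  # synthetic series (H near 0.5)
    H = hurst_variance_scaling(prices)
    print(f"H = {H:.2f}, fractal dimension = {2 - H:.2f}")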

39 pages, 5203 KB  
Technical Note
EMR-Chain: Decentralized Electronic Medical Record Exchange System
by Ching-Hsi Tseng, Yu-Heng Hsieh, Heng-Yi Lin and Shyan-Ming Yuan
Technologies 2025, 13(10), 446; https://doi.org/10.3390/technologies13100446 - 1 Oct 2025
Viewed by 500
Abstract
Current systems for exchanging medical records struggle with efficiency and privacy issues. Although the Electronic Medical Record Exchange Center (EEC), established in 2012, was intended to alleviate these issues, its centralized structure has introduced new weaknesses, including performance bottlenecks, single points of failure, and an absence of patient control over consent to use their data. Methods: This paper describes a novel EMR Gateway system that uses blockchain technology to exchange electronic medical records, overcoming the limitations of current centralized EMR-sharing systems and leveraging decentralization to enhance resilience, data privacy, and patient autonomy. The proposed system is built on two interconnected blockchains: a Decentralized Identity Blockchain (DID-Chain) based on Ethereum for managing user identities via smart contracts, and an Electronic Medical Record Blockchain (EMR-Chain) implemented on Hyperledger Fabric to handle medical record indexes and fine-grained access control. To address the dual requirements of cross-platform data exchange and patient privacy, the system was developed on the Fast Healthcare Interoperability Resources (FHIR) standard, which acts as a common language between heterogeneous healthcare systems, and incorporates stringent de-identification protocols that remove personal details from the exchanged data. In performance tests, the system ingested about 40 transactions per second and retrieved data at around 49 transactions per second, well above the average exchange volume handled by a Taiwanese hospital in 2018, indicating ample headroom for larger future workloads.
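
The de-identification step mentioned above can be pictured with a toy example: dropping the direct identifiers of a FHIR R4 Patient resource before its index is written to the chain. The field list and the helper below are illustrative only and are not the paper's actual protocol.

    # Hypothetical helper, not the EMR Gateway's real de-identification routine.
    DIRECT_IDENTIFIERS = {"name", "telecom", "address", "identifier", "photo", "contact"}

    def deidentify_patient(resource: dict) -> dict:
        assert resource.get("resourceType") == "Patient"
        return {k: v for k, v in resource.items() if k not in DIRECT_IDENTIFIERS}

    patient = {
        "resourceType": "Patient",
        "id": "example",
        "name": [{"family": "Chalmers", "given": ["Peter"]}],
        "gender": "male",
        "birthDate": "1974-12-25",
    }
    print(deidentify_patient(patient))   # keeps only resourceType, id, gender, birthDate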

19 pages, 2933 KB  
Article
Experimental Study on Wettability Characteristics of Falling Film Flow Outside Multi-Row Horizontal Tubes
by Zhenchuan Wang and Meijun Li
Processes 2025, 13(10), 3119; https://doi.org/10.3390/pr13103119 - 29 Sep 2025
Viewed by 351
Abstract
The wettability of falling film flow outside multi-row horizontal tubes is a core factor determining the heat and mass transfer performance of falling film heat exchangers, which is critical for their optimized design and stable operation. A visualization experimental platform for falling film flow over ten rows of horizontal tubes was constructed, with water as the working fluid. High-definition imaging technology and image processing methods were employed to systematically investigate the liquid film distribution and wettability under three tube diameters (d = 0.016, 0.019, 0.025 m), four tube spacings (s = 0.75d, 1d, 1.25d, 1.5d), and four inter-tube flow patterns (droplet, columnar, column-sheet, and sheet flow). Two parameters, namely the “total wetting length” and the “total wetting area”, were proposed and defined. The distribution characteristics of the wetting ratio for each row of tubes were analyzed, along with the variation laws of the total wetting area of the ten rows of tubes with respect to tube diameter, tube spacing, and liquid film Reynolds number (Rel). The following results were indicated: (1) Increasing the fluid flow rate and the tube spacing both promote the growth of the wetting length. When Rel ≤ 505, with the increase of tube diameter, the percentage of the wetting length of the tenth tube row relative to that of the first tube row decreases under the same fluid flow rate; when Rel > 505, this percentage first decreases and then increases. (2) The total wetting area exhibits a trend of “first increasing then decreasing” or “continuous increasing” with the tube spacing, and the optimal tube spacing varies by flow pattern: s/d = 1 for droplet flow (d ≤ 0.016 m), s/d = 1.25 for columnar flow, and s/d = 1.25 (0.016 m), 1 (0.019 m), 1.5 (0.025 m) for sheet flow. (3) The effect of tube diameter on the total wetting area is a balance between the inhibitory effect (reduced inter-tube fluid dynamic potential energy) and promotional effect (thinner liquid film spreading). The optimal tube diameter is 0.016 m for droplet flow and 0.025 m for columnar/sheet flow (at s/d = 1.25). (4) The wetting performance follows the order 0.016 m > 0.025 m > 0.019 m when Rel > 505, and 0.025 m > 0.019 m > 0.016 m when Rel ≤ 505. Finally, an experimental correlation formula for the wetting ratio considering the Rel, the tube diameter, and tube spacing was fitted. Comparisons with the present experimental data, the literature simulation results, and the literature experimental data showed average errors of ≤10%, ≤8%, and ≤14%, respectively, indicating high prediction accuracy. This study provides quantitative data and theoretical support for the structural optimization and operation control of multi-row horizontal tube falling film heat exchangers. Full article
(This article belongs to the Section Energy Systems)

13 pages, 3426 KB  
Article
Loss Separation Modeling and Optimization of Permalloy Sheets for Low-Noise Magnetic Shielding Devices
by Yuzheng Ma, Minxia Shi, Yachao Zhang, Teng Li, Yusen Li, Leran Zhang and Shuai Yuan
Materials 2025, 18(19), 4527; https://doi.org/10.3390/ma18194527 - 29 Sep 2025
Viewed by 374
Abstract
With the breakthroughs in quantum theory and the rapid advancement of quantum precision measurement sensor technologies, atomic magnetometers based on the spin-exchange relaxation-free (SERF) mechanism have played an increasingly important role in ultra-weak biomagnetic field detection, inertial navigation, and fundamental physics research. To achieve high-precision measurements, SERF magnetometers must operate in an extremely weak magnetic field environment, while the detection of ultra-weak magnetic signals relies on a low-noise background. Therefore, accurate measurement, modeling, and analysis of magnetic noise in shielding materials are of critical importance. In this study, the magnetic noise of permalloy sheets was modeled, separated, and analyzed based on their measured magnetic properties, providing essential theoretical and experimental support for magnetic noise evaluation in shielding devices. First, a single-sheet tester (SST) was modeled via finite element analysis to investigate magnetization uniformity, and its structure was optimized by adding a supporting connection plate. Second, an experimental platform was established to verify magnetization uniformity and to perform accurate low-frequency measurements of hysteresis loops under different frequencies and field amplitudes while ensuring measurement precision. Finally, the Bertotti loss separation method combined with a PSO optimization algorithm was employed to accurately fit and analyze the three types of losses, thereby enabling precise separation and calculation of hysteresis loss. This provides essential theoretical foundations and primary data for magnetic noise evaluation in shielding devices. Full article
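
The Bertotti loss separation referred to above splits the total specific loss into hysteresis, classical eddy-current, and excess terms, P = kh f Bm^a + kc (f Bm)^2 + ke (f Bm)^1.5, and fits the coefficients to measured loss data. The sketch below fits synthetic data with SciPy's least-squares curve_fit rather than the PSO algorithm used in the paper, and all coefficient values are invented.

    import numpy as np
    from scipy.optimize import curve_fit

    def bertotti(X, kh, kc, ke, alpha):
        f, Bm = X                                   # frequency (Hz), peak induction (T)
        return kh * f * Bm**alpha + kc * (f * Bm)**2 + ke * (f * Bm)**1.5

    f = np.tile([10.0, 20.0, 50.0, 100.0, 200.0], 3)
    Bm = np.repeat([0.1, 0.3, 0.5], 5)
    P = bertotti((f, Bm), 0.02, 1e-5, 5e-4, 1.8)    # synthetic "measured" losses
    P *= 1 + 0.02 * np.random.randn(P.size)         # 2% measurement noise

    popt, _ = curve_fit(bertotti, (f, Bm), P, p0=[0.01, 1e-5, 1e-4, 2.0], maxfev=20000)
    kh, kc, ke, alpha = popt
    share = kh * 50 * 0.3**alpha / bertotti((50.0, 0.3), *popt)
    print(f"hysteresis share at 50 Hz, 0.3 T: {share:.2f}")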

24 pages, 11488 KB  
Article
An Innovative Approach for Forecasting Hydroelectricity Generation by Benchmarking Tree-Based Machine Learning Models
by Bektaş Aykut Atalay and Kasım Zor
Appl. Sci. 2025, 15(19), 10514; https://doi.org/10.3390/app151910514 - 28 Sep 2025
Viewed by 453
Abstract
Hydroelectricity, one of the oldest and most potent forms of renewable energy, not only provides low-cost electricity for the grid but also protects the environment through flood control and irrigation support. Forecasting hydroelectricity generation is vital for utilizing resources effectively, optimizing energy production, and ensuring sustainability. This paper presents an innovative approach to hydroelectricity generation forecasting (HGF) for a 138 MW hydroelectric power plant (HPP) in the Eastern Mediterranean that, unlike prior research focusing on individual HPPs, takes into account the electricity production of the remaining upstream HPPs on the Ceyhan River within the same basin. With hyperparameters such as the number of trees and the learning rate tuned, the paper provides a thorough benchmark of state-of-the-art tree-based machine learning models, namely categorical boosting (CatBoost), extreme gradient boosting (XGBoost), and light gradient boosting machines (LightGBM). The comprehensive data set includes historical hydroelectricity generation, meteorological conditions, market pricing, and calendar variables acquired at hourly resolution from the Energy Exchange Istanbul (EXIST) transparency platform and NASA's MERRA-2 reanalysis. Although all three models performed well, LightGBM emerged as the most accurate and efficient, outperforming the others with the highest coefficient of determination (R2) (97.07%), the lowest root mean squared scaled error (RMSSE) (0.1217), and the shortest computational time (1.24 s). Consequently, the proposed methodology demonstrates significant potential for advancing HGF and should contribute to the operation of existing HPPs and to improved power dispatch planning.
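
The tree-based benchmarking workflow described above boils down to fitting gradient-boosted regressors on an hourly feature matrix and comparing error metrics. The snippet below shows the LightGBM variant on synthetic data; the feature construction, hyperparameter values, and split are placeholders, not the study's actual EXIST/MERRA-2 pipeline.

    import numpy as np
    from lightgbm import LGBMRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 8))        # stand-in for upstream generation, weather, price, calendar
    y = 30 * X[:, 0] + 10 * np.maximum(X[:, 1], 0) + rng.normal(scale=5, size=5000)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
    model = LGBMRegressor(n_estimators=400, learning_rate=0.05)   # illustrative hyperparameters
    model.fit(X_tr, y_tr)
    print("R2 on held-out hours:", round(r2_score(y_te, model.predict(X_te)), 4))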

22 pages, 2003 KB  
Article
Beyond Opacity: Distributed Ledger Technology as a Catalyst for Carbon Credit Market Integrity
by Stanton Heister, Felix Kin Peng Hui, David Ian Wilson and Yaakov Anker
Computers 2025, 14(9), 403; https://doi.org/10.3390/computers14090403 - 22 Sep 2025
Viewed by 642
Abstract
The 2015 Paris Agreement paved the way for a carbon trading economy, which has since evolved but has not yet reached a substantial scale. While carbon credit exchange is a critical mechanism for achieving global climate targets, it faces persistent challenges related to transparency, double-counting, and verification. This paper examines how Distributed Ledger Technology (DLT) can address these limitations by providing immutable transaction records, automated verification through digitally encoded smart contracts, and increased market efficiency. To assess DLT's strategic potential for the carbon markets, and more specifically whether its implementation can reduce transaction costs and enhance market integrity, three alternative approaches that apply DLT to carbon trading were examined as case studies. Comparing key elements of these DLT-based carbon credit platforms indicates that the proposed frameworks could be developed into a scalable global platform, and that integrating the existing compliance markets in the EU (case study 1), Australia (case study 2), and China (case study 3) could serve as a standard for establishing global carbon trade. The findings from these case studies suggest that while DLT offers a promising path toward more sustainable carbon markets, regulatory harmonization, standardization, and data transfer across platforms remain significant challenges.