Search Results (199)

Search Parameters:
Keywords = entropy-cloud model

27 pages, 1249 KB  
Article
Autoregressive and Residual Index Convolution Model for Point Cloud Geometry Compression
by Gerald Baulig and Jiun-In Guo
Sensors 2026, 26(4), 1287; https://doi.org/10.3390/s26041287 - 16 Feb 2026
Viewed by 143
Abstract
This study introduces a hybrid point cloud compression method that transfers from octree nodes to voxel occupancy estimation to find its lower-bound bitrate by using a Binary Arithmetic Range Coder. In previous work, we demonstrated that our entropy compression model based on index convolution achieves promising performance while maintaining low complexity. However, our previous model lacked an autoregressive approach, which is indispensable for competing with the current state of the art in compression performance. Therefore, we adapt an autoregressive grouping method that iteratively populates, explores, and estimates the occupancy of 1-bit voxel candidates in a more discrete fashion. Furthermore, we refactored our backbone architecture by adding a distiller layer to each convolution, forcing every hidden feature to contribute to the final output. Our proposed model extracts local features using lightweight 1D convolutions applied in varied orderings and analyzes causal relationships by optimizing the cross-entropy. This approach efficiently replaces the voxel convolution techniques and attention models used in previous works, providing significant improvements in both time and memory consumption. The effectiveness of our model is demonstrated on three datasets, where it outperforms recent deep learning-based compression models in this field. Full article
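The cross-entropy objective mentioned in this abstract doubles as the bitrate bound of an arithmetic coder. As an illustrative sketch only (not the authors' model; the function name and toy data are assumptions), the mean binary cross-entropy of predicted voxel occupancies, measured in bits, is the lower bound a Binary Arithmetic Range Coder can approach:

```python
import numpy as np

def occupancy_bitrate(probs, occupancy, eps=1e-12):
    """Mean bits per voxel when an arithmetic coder is driven by predicted
    occupancy probabilities; the binary cross-entropy in bits is the rate
    the coder can approach from above."""
    p = np.clip(probs, eps, 1.0 - eps)
    bits = -(occupancy * np.log2(p) + (1 - occupancy) * np.log2(1 - p))
    return bits.mean()

occ = np.array([1, 0, 0, 1, 1, 0, 0, 0])
# An uninformed predictor (p = 0.5 everywhere) costs exactly 1 bit/voxel.
uninformed = occupancy_bitrate(np.full(8, 0.5), occ)
```

A sharper predictor drives the rate toward zero, which is why better occupancy estimation translates directly into compression gains.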
24 pages, 1724 KB  
Article
P3CL: Pseudo-Label Confidence-Calibrated Curriculum Learning for Weakly Supervised Urban Airborne Laser Scanning Point Cloud Classification
by Ziwei Luo, Tao Zeng, Jun Jiang, Ziyang Cai, Wanru Wu, Zhong Xie and Yongyang Xu
Remote Sens. 2026, 18(4), 552; https://doi.org/10.3390/rs18040552 - 9 Feb 2026
Viewed by 217
Abstract
Urban airborne laser scanning (ALS) point clouds cover extensive geographical areas, rendering dense point-level annotation economically prohibitive and limiting the feasibility of fully supervised learning. In weakly supervised settings for urban ALS data, the natural long-tailed class distribution—where ground and building points dominate and smaller objects are rare—combined with the use of fixed pseudo-label thresholds under sparse annotations exacerbates confirmation bias and increases prediction uncertainty. This ultimately restricts the effective utilization of unlabeled data during training. To overcome these challenges, we propose a pseudo-label confidence-calibrated curriculum learning framework designed for weakly supervised ALS point cloud classification. The framework introduces a confidence-aware self-adaptive soft gating (CSS) mechanism that dynamically adjusts category-specific thresholds online using exponential moving average statistics and scene-aware normalization, eliminating the need for manual scheduling while improving pseudo-label quality. In addition, a reliability-driven soft selection (RSS) constraint is incorporated, in which each point is assigned a comprehensive reliability score that integrates prediction confidence, entropy clarity, and cross-augmentation consistency, enabling adaptive soft weighting to replace hard pseudo-label selection and achieve more balanced sample utilization. These components are further integrated into a unified pseudo-label confidence-calibrated curriculum learning framework (P3CL) that progressively shifts the model’s focus from high-certainty samples to more ambiguous ones, effectively mitigating confirmation bias. Extensive experiments on three public ALS benchmarks demonstrate that the proposed method consistently outperforms existing weakly supervised approaches and achieves competitive performance compared with several fully supervised models. Full article
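The reliability score described above combines three ingredients: prediction confidence, entropy clarity, and cross-augmentation consistency. A minimal sketch, assuming illustrative mixing weights and a simple EMA threshold update (neither is the paper's calibrated scheme):

```python
import numpy as np

def reliability_score(probs_a, probs_b, w=(0.4, 0.3, 0.3)):
    """Per-point reliability from class probabilities under two augmentations:
    max-softmax confidence, entropy clarity (1 - normalized entropy), and
    consistency of the argmax labels across augmentations."""
    k = probs_a.shape[1]
    confidence = probs_a.max(axis=1)
    entropy = -(probs_a * np.log(probs_a + 1e-12)).sum(axis=1) / np.log(k)
    clarity = 1.0 - entropy
    consistency = (probs_a.argmax(axis=1) == probs_b.argmax(axis=1)).astype(float)
    return w[0] * confidence + w[1] * clarity + w[2] * consistency

def ema_threshold(prev_threshold, batch_mean_conf, momentum=0.9):
    """Online category threshold update via exponential moving average,
    in the spirit of the CSS mechanism."""
    return momentum * prev_threshold + (1.0 - momentum) * batch_mean_conf
```

Soft weighting by such a score, instead of a hard confidence cut-off, is what lets ambiguous points contribute gradually as training progresses.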

19 pages, 1185 KB  
Essay
Risk Assessment of Failure Modes in Cigarette Factory Packaging Systems Based on a Heterogeneous Entropy Weight Method
by Zhuwen Liu, Jing Wang, Xiaoyuan Li and Longfei Yang
Algorithms 2026, 19(2), 135; https://doi.org/10.3390/a19020135 - 8 Feb 2026
Viewed by 215
Abstract
To address the inconsistency in risk prioritization results caused by heterogeneous information and subjective weighting in traditional Failure Mode and Effects Analysis (FMEA), this study proposes a risk priority assessment method based on a heterogeneous entropy weight framework. According to the intrinsic characteristics of different risk factors in cigarette factory packaging systems, crisp numbers, triangular fuzzy numbers, and cloud models are respectively adopted to represent Maintenance Cost, Occurrence frequency, and qualitative risk factors such as Severity and Detection. The entropy weight method is employed to objectively determine the weights of risk factors, and an improved Risk Priority Number (RPN*) is constructed. A case study of a cigarette factory packaging system demonstrates that the proposed method can effectively handle heterogeneous risk information and produce more rational failure mode rankings. Comparative analysis using the Pearson correlation coefficient shows that the proposed method exhibits higher consistency and reliability than traditional RPN and single entropy weight methods. Full article
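The entropy weight method used here to derive objective factor weights follows a standard recipe: normalize each column of the decision matrix, compute its entropy, and weight factors by their divergence from maximum entropy. A generic sketch (the matrix values are made up, not the case study's data):

```python
import numpy as np

def entropy_weights(X):
    """Objective weights from a decision matrix X (rows: failure modes,
    columns: risk factors), via the standard entropy weight method."""
    P = X / X.sum(axis=0)                      # column-wise normalization
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    E = -plogp.sum(axis=0) / np.log(n)         # entropy of each factor
    d = 1.0 - E                                # degree of divergence
    return d / d.sum()

X = np.array([[2.0, 9.0, 3.0],
              [2.0, 1.0, 4.0],
              [2.0, 5.0, 5.0]])
w = entropy_weights(X)   # a constant column carries no information -> ~0 weight
```

Factors whose scores barely vary across failure modes receive near-zero weight, which is exactly the "objectivity" the method contributes before forming the improved RPN*.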

26 pages, 300 KB  
Review
Theoretical Foundations and Architectural Evolution of Cyberspace Endogenous Security: A Comprehensive Survey
by Heming Zhang, Jian Li, Hong Wang, Shizhong Xu, Hong Yang and Haitao Wu
Appl. Sci. 2026, 16(4), 1689; https://doi.org/10.3390/app16041689 - 8 Feb 2026
Viewed by 236
Abstract
The endogenous security paradigm has emerged to address the limitations of traditional cybersecurity, which relies on reactive “patching” and struggles against unknown threats, APTs, and supply chain attacks. Centered on the principle that “structure determines security”, it diverges from detection-based approaches by employing systems theory and cybernetics to architect closed-loop systems with “heterogeneous execution, multimodal adjudication, and dynamic scheduling”. This is realized through intrinsic architectural constructs such as dynamism, heterogeneity, and redundancy. Theoretically, it transforms deterministic component-level attacks into probabilistic system-level events, thereby shifting the security foundation from a “cognitive contest” to an “entropy-driven confrontation”. This paper provides a comprehensive review of this paradigm. We begin by elucidating its philosophical foundations and core axioms, focusing on the Dynamic Heterogeneous Redundancy (DHR) model, which converts attacks on specific vulnerabilities into probabilistic events under the core assumption of independent heterogeneous execution entities. Next, we trace the architectural evolution from early mimic defense prototypes to a universal framework, analyzing key developments including expanded heterogeneity dimensions, intelligence-driven dynamic policies, and enhanced adjudication mechanisms. We then explore essential enabling technologies and their integration with cutting-edge trends such as artificial intelligence, 6G, and cloud-native computing. Through case studies of the 5G core network and intelligent connected vehicles, the engineering feasibility of the endogenous security paradigm has been validated, with quantifiable security gains demonstrated. In a live-network pilot of the endogenous security micro-segmentation system for the 5G core, resource consumption (CPU/memory usage) of network function virtual machines remained below 3% under steady-state service loads. 
The system concurrently maintained microsecond-level forwarding performance and achieved carrier-grade core service availability of 99.999%. These results demonstrate that the endogenous security mechanism delivers high-level structural security with an acceptable performance cost. The paper also critically summarizes current theoretical, engineering, and ecosystem challenges, while outlining future research directions such as “Endogenous Security as a Service” and convergence with quantum-safe technologies. Full article
(This article belongs to the Special Issue AI Technology and Security in Cloud/Big Data)
30 pages, 616 KB  
Article
Structural Preservation in Time Series Through Multiscale Topological Features Derived from Persistent Homology
by Luiz Carlos de Jesus, Francisco Fernández-Navarro and Mariano Carbonero-Ruz
Mathematics 2026, 14(3), 538; https://doi.org/10.3390/math14030538 - 2 Feb 2026
Viewed by 293
Abstract
A principled, model-agnostic framework for structural feature extraction in time series is presented, grounded in topological data analysis (TDA). The motivation stems from two gaps identified in the literature: First, compact and interpretable representations that summarise the global geometric organisation of trajectories across scales remain scarce. Second, a unified, task-agnostic protocol for evaluating structure preservation against established non-topological families is still missing. To address these gaps, time-delay embeddings are employed to reconstruct phase space, sliding windows are used to generate local point clouds, and Vietoris–Rips persistent homology (up to dimension two) is computed. The resulting persistence diagrams are summarised with three transparent descriptors—persistence entropy, maximum persistence amplitude, and feature counts—and concatenated across delays and window sizes to yield a multiscale representation designed to complement temporal and spectral features while remaining computationally tractable. A unified experimental design is specified in which heterogeneous, regularly sampled financial series are preprocessed on native calendars and contrasted with competitive baselines spanning lagged, calendar-driven, difference/change, STL-based, delay-embedding PCA, price-based statistical, signature (FRUITS), and network-derived (NetF) features. Structure preservation is assessed through complementary criteria that probe spectral similarity, variance-scaled reconstruction fidelity, and the conservation of distributional shape (location, scale, asymmetry, tails). The study is positioned as an evaluation of representations, rather than a forecasting benchmark, emphasising interpretability, comparability, and methodological transparency while outlining avenues for adaptive hyperparameter selection and alternative filtrations. Full article
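The three diagram descriptors named above (persistence entropy, maximum persistence amplitude, feature counts) all come from the lifetimes of points in a persistence diagram. A self-contained sketch with a toy diagram (illustrative values, not from the paper):

```python
import numpy as np

def persistence_summaries(diagram):
    """Summaries of a persistence diagram given as (birth, death) pairs:
    persistence entropy of the lifetime distribution, maximum persistence
    amplitude, and the feature count."""
    d = np.asarray(diagram, dtype=float)
    life = d[:, 1] - d[:, 0]           # lifetimes of topological features
    p = life / life.sum()              # lifetime distribution
    entropy = -(p * np.log(p)).sum()   # persistence entropy
    return entropy, life.max(), len(d)

# One dominant feature and two short-lived ones: entropy well below log(3).
H, amp, n = persistence_summaries([(0.0, 0.9), (0.1, 0.2), (0.3, 0.4)])
```

Concatenating these scalars across delays and window sizes is what yields the multiscale representation the abstract describes.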

40 pages, 7546 KB  
Article
Hierarchical Soft Actor–Critic Agent with Automatic Entropy, Twin Critics, and Curriculum Learning for the Autonomy of Rock-Breaking Machinery in Mining Comminution Processes
by Guillermo González, John Kern, Claudio Urrea and Luis Donoso
Processes 2026, 14(2), 365; https://doi.org/10.3390/pr14020365 - 20 Jan 2026
Viewed by 362
Abstract
This work presents a hierarchical deep reinforcement learning (DRL) framework based on Soft Actor–Critic (SAC) for the autonomy of rock-breaking machinery in surface mining comminution processes. The proposed approach explicitly integrates mobile navigation and hydraulic manipulation as coupled subprocesses within a unified decision-making architecture, designed to operate under the unstructured and highly uncertain conditions characteristic of open-pit mining operations. The system employs a hysteresis-based switching mechanism between specialized SAC subagents, incorporating automatic entropy tuning to balance exploration and exploitation, twin critics to mitigate value overestimation, and curriculum learning to manage the progressive complexity of the task. Two coupled subsystems are considered, namely: (i) a tracked mobile machine with a differential drive, whose continuous control enables safe navigation, and (ii) a hydraulic manipulator equipped with an impact hammer, responsible for the fragmentation and dismantling of rock piles through continuous joint torque actuation. Environmental perception is modeled using processed perceptual variables obtained from point clouds generated by an overhead depth camera, complemented with state variables of the machinery. System performance is evaluated in unstructured and uncertain simulated environments using process-oriented metrics, including operational safety, task effectiveness, control smoothness, and energy consumption. The results show that the proposed framework yields robust, stable policies that achieve superior overall process performance compared to equivalent hierarchical configurations and ablation variants, thereby supporting its potential applicability to DRL-based mining automation systems. Full article
(This article belongs to the Special Issue Advances in the Control of Complex Dynamic Systems)
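Automatic entropy tuning in SAC adjusts the temperature α so that policy entropy tracks a target. A minimal sketch of the common surrogate update on log α (illustrative only; the learning rate and batch values are assumptions, not the authors' agent):

```python
import numpy as np

def update_log_alpha(log_alpha, log_pi_batch, target_entropy, lr=0.1):
    """One gradient-descent step on the usual SAC temperature surrogate
    loss -(log_alpha * (log_pi + H_target)).mean(): alpha grows when policy
    entropy (-E[log_pi]) falls below the target, and shrinks otherwise."""
    grad = -np.mean(log_pi_batch + target_entropy)  # d(loss)/d(log_alpha)
    return log_alpha - lr * grad
```

Driving α up restores exploration when the policy becomes too deterministic, which is the exploration/exploitation balance the abstract refers to.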

20 pages, 5778 KB  
Article
DTD: Density Triangle Descriptor for 3D LiDAR Loop Closure Detection
by Kaiwei Tang, Qing Wang, Chao Yan, Yang Sun and Shengyi Liu
Sensors 2026, 26(1), 201; https://doi.org/10.3390/s26010201 - 27 Dec 2025
Viewed by 606
Abstract
Loop closure detection is essential for improving the long-term consistency and robustness of simultaneous localization and mapping (SLAM) systems. Existing LiDAR-based loop closure approaches often rely on limited or partial geometric features, restricting their performance in complex environments. To address these limitations, this paper introduces a Density Triangle Descriptor (DTD). The proposed method first extracts keypoints from density images generated from LiDAR point clouds, and then constructs a triangle-based global descriptor that is invariant to rotation and translation, enabling robust structural representation. Furthermore, to enhance local discriminative ability, the neighborhood around each keypoint is modeled as a Gaussian distribution, and a local descriptor is derived from the entropy of its probability distribution. During loop closure detection, candidate matches are first retrieved via hash indexing of triangle edge lengths, followed by entropy-based local verification, and are finally refined by singular value decomposition for accurate pose estimation. Extensive experiments on multiple public datasets demonstrate that compared to STD, the proposed DTD improves the average F1 max score and EP by 18.30% and 20.08%, respectively, while achieving a 50.57% improvement in computational efficiency. Moreover, DTD generalizes well to solid-state LiDAR with non-repetitive scanning patterns, validating its robustness and applicability in complex environments. Full article
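Modeling a keypoint's neighborhood as a Gaussian gives a closed-form entropy, H = ½ ln((2πe)^k det Σ), which can serve as a compact local descriptor. This is an illustrative reconstruction of that idea, not the authors' exact DTD descriptor:

```python
import numpy as np

def gaussian_entropy(points):
    """Differential entropy of a Gaussian fitted to a k-D point neighborhood:
    H = 0.5 * ln((2*pi*e)^k * det(Sigma))."""
    cov = np.cov(np.asarray(points, dtype=float).T)
    k = cov.shape[0]
    return 0.5 * np.log(((2.0 * np.pi * np.e) ** k) * np.linalg.det(cov))

pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                [0, 0, 1], [1, 1, 1]], dtype=float)
```

Shrinking a neighborhood by a factor s lowers its entropy by k·ln s, so tight, structured neighborhoods score lower than diffuse ones, which is what makes the value discriminative for local verification.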

29 pages, 36160 KB  
Article
Phenological Monitoring and Discrimination of Rice Ecosystems Using Multi-Temporal and Multi-Sensor Polarimetric SAR
by Jean Rochielle F. Mirandilla, Megumi Yamashita and Mitsunori Yoshimura
Remote Sens. 2025, 17(24), 4007; https://doi.org/10.3390/rs17244007 - 11 Dec 2025
Viewed by 647
Abstract
Synthetic Aperture Radar (SAR) has been widely applied for rice monitoring, especially in cloud-prone areas, due to its ability to penetrate clouds. However, few methods have been developed to monitor irrigated and rainfed rice ecosystems separately. This study demonstrated the use of multi-temporal polarimetric dual-polarization (dual-pol) SAR (Sentinel-1B and ALOS PALSAR-2) data to monitor and discriminate the irrigated and favorable rainfed rice ecosystems in the province of Iloilo, Philippines. Key polarimetric parameters derived from H–A–α and model-based dual-pol decomposition were analyzed to characterize the rice phenology of both ecosystems. Segmented regression was performed to detect breakpoints corresponding to changes in rice phenology within each ecosystem and to identify the parameters to use for classification. Based on the results, Sentinel-1B polarimetric parameters (entropy, anisotropy, and alpha) can capture the phenological dynamics, whereas ALOS2 polarimetric parameters were more sensitive to water conditions, as reflected in span and volume scattering. Furthermore, irrigated rice exhibited more stable and predictable scattering patterns than favorable rainfed rice. Using the Random Forest classifier, various combinations of backscatter and polarimetric parameters from Sentinel-1B and ALOS2 were tested to discriminate between the two ecosystems. The highest classification accuracy (81.81% overall accuracy; Kappa = 0.6345) was achieved using the combined backscatter (S1B VH, ALOS2 HH, and HV) and polarimetric parameters from both sensors. The results demonstrated that polarimetric parameters effectively capture phenological stages and associated scattering mechanisms, with the integration of Sentinel-1B and ALOS2 data improving the discrimination of irrigated and favorable rainfed rice systems. Full article
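For a dual-pol acquisition, the entropy, anisotropy, and alpha parameters come from the eigen-decomposition of the 2×2 coherency matrix. A schematic version (an illustrative reconstruction for the dual-pol case, with base-2 entropy over the two eigenvalues; it is not the full-pol decomposition or the authors' processing chain):

```python
import numpy as np

def h_a_alpha_dualpol(T):
    """Entropy, anisotropy, and mean alpha angle (degrees) from a 2x2
    Hermitian dual-pol coherency matrix T."""
    lam, vec = np.linalg.eigh(np.asarray(T, dtype=complex))
    lam, vec = lam[::-1], vec[:, ::-1]               # sort descending
    p = lam.real / lam.real.sum()                    # pseudo-probabilities
    entropy = -(p * np.log2(np.clip(p, 1e-12, 1.0))).sum()
    anisotropy = (p[0] - p[1]) / (p[0] + p[1])
    alphas = np.degrees(np.arccos(np.abs(vec[0, :])))  # per-eigenvector alpha
    return entropy, anisotropy, (p * alphas).sum()
```

Equal eigenvalues give maximum entropy (fully depolarized scattering); a single dominant eigenvalue gives zero entropy and unit anisotropy, the kind of stable signature the abstract associates with irrigated paddies.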

26 pages, 8108 KB  
Article
A Multi-Step Grasping Framework for Zero-Shot Object Detection in Everyday Environments Based on Lightweight Foundational General Models
by Ruibo Li, Tie Zhang and Yanbiao Zou
Sensors 2025, 25(23), 7125; https://doi.org/10.3390/s25237125 - 21 Nov 2025
Viewed by 1132
Abstract
Achieving object grasping in everyday environments by leveraging the powerful generalization capabilities of foundational general models while enhancing their deployment efficiency within robotic control systems represents a key challenge for service robots. To address the application environments and hardware resource constraints of household robots, a Three-step Pipeline Grasping Framework (TPGF) is proposed for zero-shot object grasping. The framework operates on the principle of “object perception–object point cloud extraction–grasping pose determination” and requires no training or fine-tuning. We integrate advanced foundational models into the Object Perception Module (OPM) to maximize zero-shot generalization and develop a novel Point Cloud Extraction Method (PCEM) based on Depth Information Suppression (DIS) to enable targeted grasping from complex scenes. Furthermore, to significantly reduce hardware overhead and accelerate deployment, a Saturated Truncation strategy based on relative information entropy is introduced for high-precision quantization, resulting in the highly efficient model, EntQ-EdgeSAM. Experimental results on public datasets demonstrate the superior inspection generalization of the combined foundational models compared to task-specific baselines. The proposed Saturated Truncation strategy achieves 3–21% higher quantization accuracy than symmetric uniform quantization, leading to 3.5% model file compression and 95% faster inference speed for EntQ-EdgeSAM. Grasping experiments confirm that the TPGF achieves robust recognition accuracy and high grasping success rates in zero-shot object grasping tasks within replicated everyday environments, proving its practical value and efficiency for real-world robotic deployment. Full article
(This article belongs to the Section Intelligent Sensors)
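A saturated-truncation calibration in the spirit described above picks the clipping threshold whose clipped, coarsely quantized activation histogram stays closest in relative entropy (KL divergence) to the original distribution. The sketch below is a generic reconstruction; bin counts, level counts, and data are assumptions, not the paper's EntQ-EdgeSAM procedure:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Relative entropy KL(p || q) between two histograms."""
    p, q = p / p.sum(), q / q.sum()
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / np.clip(q[mask], eps, None))).sum())

def saturation_threshold(activations, candidates, bins=64, levels=16):
    """Choose the clip value t whose activations, saturated at t and
    quantized to `levels` uniform steps on [0, t], best preserve the
    reference histogram in the KL sense. Smaller t = finer steps for the
    bulk of the distribution, at the cost of saturating outliers."""
    x = np.abs(np.asarray(activations, dtype=float))
    hi = x.max()
    ref, _ = np.histogram(x, bins=bins, range=(0.0, hi))
    best_t, best_kl = None, np.inf
    for t in candidates:
        q = np.minimum(x, t)                              # saturate at t
        q = np.round(q / t * (levels - 1)) / (levels - 1) * t  # coarse grid
        h, _ = np.histogram(q, bins=bins, range=(0.0, hi))
        kl = kl_divergence(ref.astype(float), h.astype(float))
        if kl < best_kl:
            best_kl, best_t = kl, t
    return best_t
```

With a heavy bulk of small activations and a few outliers, a tight threshold wins despite the clipping, which is why KL-guided saturation beats naive symmetric uniform quantization.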

19 pages, 385 KB  
Article
Thermodynamics of Fluid Elements in the Context of Turbulent Isothermal Self-Gravitating Molecular Clouds in Virial Equilibrium
by Sava D. Donkov, Ivan Zhivkov Stefanov and Valentin Kopchev
Universe 2025, 11(12), 383; https://doi.org/10.3390/universe11120383 - 21 Nov 2025
Viewed by 372
Abstract
In this paper, we continue the study of the thermodynamics of fluid elements in isothermal turbulent self-gravitating systems, represented by molecular clouds. We build the model again on the hypothesis that, locally, the turbulent kinetic energy per fluid element can be substituted for the macro-temperature of a gas of fluid elements. Also, we presume that the cloud has a fractal nature. The virial theorem is applicable to our system too (hence it is in a dynamical equilibrium). But, in contrast to the previous work, where the turbulent kinetic energy clearly dominates over the gravity, in the present paper, we assume that the virial relation 2E_kin + E_grav = 0 holds for the entire cloud. Hence, the cloud is a dense and strongly self-gravitating object. On that basis, we calculate the internal and the total energy per fluid element. Writing down the first principle of thermodynamics, we obtain the explicit form of the entropy increment. It demonstrates untypical behavior. In the range 0 ≤ β < 0.4 for the turbulent scaling exponent, the entropy increment is positive; in the interval 0.4 < β ≤ 1, it is negative; and at β_cr = 0.4, it is zero. The latter two regimes (negative and zero) cannot be explained from the classical point of view. However, we give some arguments for the reasons for these irregularities, the main one being that our cloud is an open self-organizing system driven by the gravity. Moreover, we study the system for critical points under the conditions of three thermodynamic ensembles: micro-canonical, canonical, and grand canonical. Only the canonical ensemble exhibits a critical point, which is a maximum of the free energy and corresponds to an unstable equilibrium of the system. Analysis of the equilibrium potentials also shows that the system resides in unstable states under all the conditions. We explain these results by the hypothesis that the virialized cloud is in the final unstable state before its contraction and subsequent fragmentation or collapse. Full article

53 pages, 5248 KB  
Article
Emission/Reliability-Aware Stochastic Optimization of Electric Bus Parking Lots and Renewable Energy Sources in Distribution Network: A Fuzzy Multi-Objective Framework Considering Forecasted Data
by Masood ur Rehman, Ujwal Ramesh Shirode, Aarti Suryakant Pawar, Tze Jin Wong, Egambergan Khudaynazarov and Saber Arabi Nowdeh
World Electr. Veh. J. 2025, 16(11), 624; https://doi.org/10.3390/wevj16110624 - 17 Nov 2025
Viewed by 585
Abstract
In this paper, an emission- and reliability-aware stochastic optimization model is proposed for the economic planning of electric bus parking lots (EBPLs) with photovoltaic (PV) and wind-turbine (WT) resources in an 85-bus radial distribution network. The model simultaneously minimizes operating, emission, and energy-loss costs while increasing system reliability, measured by energy not supplied (ENS), and uses a fuzzy decision-making approach to determine the final solution. To address optimization challenges, a new multi-objective entropy-guided Sinh–Cosh Optimizer (MO-ESCHO) is proposed to efficiently mitigate premature convergence and produce a well-distributed Pareto front. Also, a hybrid forecasting architecture that combines MO-ESCHO and artificial neural networks (ANN) is proposed for accurate prediction of PV and WT power and network loading. The framework is tested across five cases, progressively incorporating EBPL, demand response (DR), forecast information, and stochastic simulation of uncertainties using a new hybrid Unscented Transformation–Cubature Quadrature Rule (UT-CQR) method. Comparative analyses against conventional methods confirm superior performance in achieving better objective values and ensuring computational efficiency. The outcomes indicate that the combination of EBPL with RES reduces operating costs by 5.23%, emission costs by 27.39%, and ENS by 11.48% compared with the base case with RES alone. Moreover, incorporating the stochastic model increases operating costs by 6.03%, emission costs by 5.05%, and ENS by 7.94% over the deterministic forecast case, reflecting the added complexity of uncertainty. The main contributions lie in coupling EBPLs and RES under uncertainty and proposing UT-CQR, which exhibits robust system performance with reduced variance and lower computational effort compared with Monte Carlo and cloud-model approaches. Full article
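The Unscented Transformation at the core of the hybrid UT-CQR method propagates a distribution through a nonlinearity with a deterministic set of sigma points rather than Monte Carlo samples. A generic 2n+1-point sketch with standard scaling parameters (illustrative; the paper's cubature-quadrature coupling is not reproduced here):

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=0.1, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using 2n+1 sigma points with the standard UT weights."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)          # matrix square root
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha**2 + beta)
    Y = np.array([f(p) for p in pts])
    m = (Wm[:, None] * Y).sum(axis=0)                # transformed mean
    d = Y - m
    P = (Wc[:, None, None] * d[:, :, None] * d[:, None, :]).sum(axis=0)
    return m, P                                      # transformed covariance
```

For a linear map the transform is exact, which makes a handy sanity check; the payoff over Monte Carlo is the small fixed number of model evaluations per uncertain input, the variance and runtime advantage the abstract reports.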

19 pages, 913 KB  
Article
Decision-Making Model for Risk Assessment in Cloud Computing Using the Enhanced Hierarchical Holographic Modeling
by Auday Qusay Sabri and Halina Binti Mohamed Dahlan
Computers 2025, 14(11), 491; https://doi.org/10.3390/computers14110491 - 13 Nov 2025
Viewed by 626
Abstract
Risk assessment is critical for securing and sustaining operational resilience in cloud computing. Traditional approaches often rely on single-objective or subjective weighting methods, limiting their accuracy and adaptability to dynamic cloud conditions. To address this gap, this study provides a framework for multi-layered decision-making using an Enhanced Hierarchical Holographic Modeling (EHHM) approach for cloud computing security risk assessment. Two methods were used, the Entropy Weight Method (EWM) and Criteria Importance Through Intercriteria Correlation (CRITIC), to provide a multi-factor decision-making risk assessment framework across the different security domains that exist within cloud computing. Additionally, fuzzy set theory captured the respective levels of complexity, dispersion, and ambiguity, thus facilitating accurate and objective cloud risk assessment across asymmetric information. The trapezoidal membership function measures the correlation, rank, and scores, and was applied to each corresponding cloud security domain. The novelty of this research lies in enhancing HHM with an expanded security-transfer domain that encompasses the client side, integrating dual-objective weighting (EWM + CRITIC), and using fuzzy logic to quantify asymmetric uncertainty in judgments. Informed, data-driven, multidimensional cloud risk assessment has not been reported in previous studies using HHM. The different Integrated Weight measures allowed for accurate risk judgments. The risk assessment across the calculated cloud computing security domains resulted in a total score of 0.074233, thus supporting the proposed model in identifying and prioritizing risks. Furthermore, the scores of the cloud computing dimensions highlight EHHM as a suitable framework to support corporate decision-making and informed risk awareness in a turbulent and dynamic cloud computing environment subject to operational risk. Full article
(This article belongs to the Special Issue Cloud Computing and Big Data Mining)
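A trapezoidal membership function like the one used for scoring can be written down directly; the breakpoints below are illustrative, not the study's calibrated fuzzy sets:

```python
def trapezoidal_membership(x, a, b, c, d):
    """Membership degree of x in the trapezoidal fuzzy set (a, b, c, d):
    0 outside [a, d], rising linearly on [a, b], 1 on [b, c],
    falling linearly on [c, d]. Assumes a < b <= c < d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)
```

The flat top is what distinguishes it from a triangular fuzzy number: a whole interval of judgments counts as fully plausible, which suits expert ratings with inherent ambiguity.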

27 pages, 6822 KB  
Article
Generalized Variational Retrieval of Full Field-of-View Cloud Fraction and Precipitable Water Vapor from FY-4A/GIIRS Observations
by Gen Wang, Song Ye, Bing Xu, Xiefei Zhi, Qiao Liu, Yang Liu, Yue Pan, Chuanyu Fan, Tiening Zhang and Feng Xie
Remote Sens. 2025, 17(22), 3687; https://doi.org/10.3390/rs17223687 - 11 Nov 2025
Viewed by 777
Abstract
Owing to their high vertical resolution, remote sensing data from meteorological satellite hyperspectral infrared sounders are well-suited for the identification, monitoring, and early warning of high-impact weather events. The effective utilization of full field-of-view (FOV) observations from satellite infrared sounders in high-impact weather applications remains a major research focus and technical challenge worldwide. This study proposes a generalized variational retrieval framework to estimate full FOV cloud fraction and precipitable water vapor (PWV) from observations of the Geostationary Interferometric Infrared Sounder (GIIRS) onboard the Fengyun-4A (FY-4A) satellite. Based on this method, experiments are performed using high-frequency FY-4A/GIIRS observations during the landfall periods of Typhoon Lekima (2019) and Typhoon Higos (2020). A three-step channel selection strategy based on information entropy is first designed for FY-4A/GIIRS. A constrained generalized variational retrieval method coupled with a cloud cost function is then established. Cloud parameters, including effective cloud fraction and cloud-top pressure, are initially retrieved using the Minimum Residual Method (MRM) and used as initial cloud information. These parameters are iteratively optimized through cost-function minimization, yielding full FOV cloud fields and atmospheric profiles. Full FOV brightness temperature simulations are conducted over cloudy regions to quantitatively evaluate the retrieved cloud fractions, and the derived PWV is further applied to the identification and analysis of hazardous weather events. Experimental results demonstrate that incorporating cloud parameters as auxiliary inputs to the radiative transfer model improves the simulation of FY-4A/GIIRS brightness temperature in cloud-covered areas and reduces brightness temperature biases. 
Compared with ERA5 Total Column Water Vapour (TCWV) data, the PWV derived from full FOV profiles containing cloud parameter information shows closer agreement and, at certain FOVs, more effectively indicates the occurrence of high-impact weather events. The simplified methodology proposed in this study provides a robust basis for the future assimilation and operational utilization of infrared data over cloud-affected regions in numerical weather prediction models.
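The abstract above mentions a three-step channel selection strategy based on information entropy but does not spell it out. A common building block for such schemes is greedy selection by entropy reduction (information content), in which each channel's gain is scored against the current state covariance. The sketch below is a generic illustration under assumed Jacobians and error covariances, not the actual FY-4A/GIIRS procedure:

```python
import numpy as np

def select_channels(K, Sa, noise_var, n_select):
    """Greedy channel selection by entropy reduction (information content).

    K         : (n_chan, n_state) Jacobian of brightness temperature w.r.t. the state
    Sa        : (n_state, n_state) prior (background) error covariance
    noise_var : (n_chan,) observation-error variance per channel
    Returns the indices of the n_select most informative channels.
    """
    S = Sa.copy()
    remaining = list(range(K.shape[0]))
    chosen = []
    for _ in range(n_select):
        # entropy reduction from adding channel i: 0.5 * ln(1 + k S k^T / sigma_i^2)
        gains = [0.5 * np.log1p(K[i] @ S @ K[i] / noise_var[i]) for i in remaining]
        best = remaining[int(np.argmax(gains))]
        chosen.append(best)
        remaining.remove(best)
        k = K[best]
        # rank-1 update of the state covariance after assimilating the chosen channel
        S = S - np.outer(S @ k, k @ S) / (noise_var[best] + k @ S @ k)
    return chosen

# hypothetical demo: three candidate channels sounding a two-variable state
Sa = np.eye(2)
K = np.array([[1.0, 0.0], [0.1, 0.0], [0.0, 0.5]])
picked = select_channels(K, Sa, np.ones(3), n_select=2)
```

Each iteration picks the channel with the largest entropy reduction and shrinks the state covariance accordingly, so channels redundant with earlier picks score lower in later rounds.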

18 pages, 2215 KB  
Article
A Dynamic Evaluation Method for Pumped Storage Units Adapting to Asymmetric Evolution of Power System
by Longxiang Chen, Yuan Wang, Hengyu Xue, Lei Deng, Ziwei Zhong, Xuan Jia, Shuo Feng and Jun Xie
Symmetry 2025, 17(11), 1900; https://doi.org/10.3390/sym17111900 - 7 Nov 2025
Viewed by 377
Abstract
As the core components of pumped storage stations (PSS), pumped storage units (PSUs) require a scientific and comprehensive evaluation method to guide the selection of optimal units and support the development of the new-type power system (NPS). This paper addresses the symmetry issues in PSU evaluation methods by proposing an approach based on evolutionary combination weighting and cloud model theory, thereby adapting to the long-term asymmetric evolution of the power system. First, the subjective and objective weights of indicators at all levels are obtained using the analytic hierarchy process (AHP) and the entropy weight method (EWM). Then, the optimal combination coefficients for the subjective and objective weights are determined through game theory, achieving symmetry and balance between them. Subsequently, the indicator weights are dynamically corrected with a designed evolutionary response function, enabling them to evolve in response to the asymmetric development of the power system. Finally, the cloud model is employed to characterize the randomness and fuzziness of evaluation boundaries, which enhances the adaptability of the evaluation process and the interpretability of the results. Simulation results show that, when the long-term asymmetric evolution of the power system is considered, the expected score deviations of the four secondary indicators are approximately 4.7%, 1.3%, 3.5%, and 7.7%, respectively, with an overall score deviation of about 6.4%. The proposed method not only achieves symmetry and balance between subjective and objective factors in traditional evaluation but also accommodates the asymmetric evolution requirements of the power system.
(This article belongs to the Special Issue Symmetry with Power Systems: Control and Optimization)
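The AHP/EWM combination weighting via game theory described in this abstract follows a well-known recipe: compute objective weights from indicator entropies, then solve a small linear system for the optimal combination coefficients. The sketch below is a minimal generic illustration with hypothetical subjective weights and decision data, not the authors' implementation:

```python
import numpy as np

def entropy_weights(X):
    """Objective weights via the entropy weight method (EWM).
    X: (m schemes, n indicators), strictly positive, benefit-type values."""
    P = X / X.sum(axis=0)                                  # proportion of each scheme per indicator
    e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])  # indicator entropies in [0, 1]
    d = 1.0 - e                                            # divergence: low entropy -> high weight
    return d / d.sum()

def game_theory_combine(weight_sets):
    """Optimal combination of several weight vectors (game-theoretic weighting).
    Solves A @ alpha = b with A_ij = w_i . w_j and b_i = w_i . w_i,
    normalizes alpha, and returns the combined weight vector."""
    W = np.vstack(weight_sets)                             # (k sets, n indicators)
    A = W @ W.T
    alpha = np.linalg.solve(A, np.diag(A))
    alpha = np.abs(alpha) / np.abs(alpha).sum()            # normalized combination coefficients
    w = alpha @ W
    return w / w.sum()

# hypothetical data: AHP-style subjective weights and a small decision matrix
w_subjective = np.array([0.5, 0.3, 0.2])
X = np.array([[0.9, 0.2, 0.4],
              [0.1, 0.8, 0.5],
              [0.5, 0.5, 0.6]])
w_combined = game_theory_combine([w_subjective, entropy_weights(X)])
```

The combined vector minimizes its distance to both the subjective and objective weights, which is the "symmetry and balance" property the abstract refers to.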

20 pages, 2482 KB  
Article
Safety Risk Evaluation of Water and Mud Inrush in Karst Tunnel Based on an Improved Weighted Cloud Model
by Baofu Duan, Anni Chu, Liankai Bu, Zhihong Li and Keyan Long
Sustainability 2025, 17(20), 9328; https://doi.org/10.3390/su17209328 - 21 Oct 2025
Viewed by 612
Abstract
Frequent water and mud inrush accidents during karst tunnel construction severely impact construction safety, environmental sustainability, and the long-term use of infrastructure. Conducting a practical risk assessment for karst tunnel water and mud inrush is therefore crucial for promoting sustainable tunnel engineering, as it can mitigate catastrophic events that lead to resource waste, ecological damage, and economic loss. This paper establishes an improved weighted cloud model to evaluate the risk factors of water and mud inrush in karst tunnels. Subjective weights for the risk indicators are calculated with the ordinal relationship (G1) method, a subjective weighting method derived from the analytic hierarchy process; objective weights are calculated with an improved entropy weight method that avoids the calculation distortion of the traditional entropy weight method. Game theory is applied to determine the optimal combination coefficients for the two weighting methods, and cloud model theory is introduced to reduce the fuzziness of the membership intervals during the assessment. The established risk assessment model was applied to five sections of the Furong Tunnel and Cushishan Tunnel in Southwest China, where the final risk ratings were determined as "High Risk", "High Risk", "Medium Risk", "High Risk", and "Moderate Risk", respectively. These results align with the findings of field investigations, validating the effectiveness and reliability of the combined-weighting, cloud model-based water and mud inrush risk assessment. Compared with traditional methods such as fuzzy comprehensive evaluation and entropy weighting, the proposed model's results agree more closely with field observations and are more reliable, providing a foundation for assessing water and mud inrush and other tunnel hazards.
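The cloud model used in this abstract (and the pumped-storage evaluation above) represents each evaluation grade by three digital characteristics: expectation Ex, entropy En, and hyper-entropy He. A minimal sketch of the standard forward normal cloud generator and the expected certainty degree of a crisp indicator value follows; the numerical parameters are hypothetical, not the paper's actual grade boundaries:

```python
import numpy as np

def forward_cloud(Ex, En, He, n=1000, rng=None):
    """Forward normal cloud generator: n cloud drops and their certainty degrees."""
    rng = np.random.default_rng(rng)
    Enn = rng.normal(En, He, n)                   # per-drop entropy sample
    Enn = np.where(np.abs(Enn) < 1e-12, En, Enn)  # guard degenerate samples
    x = rng.normal(Ex, np.abs(Enn))               # cloud drops
    mu = np.exp(-(x - Ex) ** 2 / (2 * Enn ** 2))  # certainty degree of each drop
    return x, mu

def membership(v, Ex, En, He, n=5000, rng=None):
    """Expected certainty degree of a crisp indicator value v
    with respect to a grade cloud (Ex, En, He)."""
    rng = np.random.default_rng(rng)
    Enn = rng.normal(En, He, n)
    Enn = np.where(np.abs(Enn) < 1e-12, En, Enn)
    return float(np.exp(-(v - Ex) ** 2 / (2 * Enn ** 2)).mean())
```

In a combined-weighting assessment of this kind, each indicator value is scored against every grade cloud, and the weighted certainty degrees across indicators decide the final risk grade; He controls how "fuzzy" the grade boundary is.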