Search Results (4)

Search Parameters:
Keywords = multiscale cosine similarity entropy

42 pages, 6728 KB  
Article
Positioning Fractal Dimension and Lacunarity in the IBSI Feature Space: Simulation With and Without Wavelets
by Mostafa Zahed and Maryam Skafyan
Radiation 2025, 5(4), 32; https://doi.org/10.3390/radiation5040032 - 3 Nov 2025
Viewed by 653
Abstract
Fractal dimension (Frac) and lacunarity (Lac) are frequently proposed as biomarkers of multiscale image complexity, but their incremental value over standardized radiomics remains uncertain. We position both measures within the Image Biomarker Standardisation Initiative (IBSI) feature space by running a fully reproducible comparison in two settings. In a baseline experiment, we analyze N=1000 simulated 64×64 textured ROIs discretized to Ng=64, computing 92 IBSI descriptors together with Frac (box counting) and Lac (gliding box), for 94 features per ROI. In a wavelet-augmented experiment, we analyze N=1000 ROIs and add level-1 wavelet descriptors by recomputing first-order and GLCM features in each sub-band (LL, LH, HL, and HH), contributing 4×(19+19)=152 additional features and yielding 246 features per ROI. Feature similarity is summarized by a consensus score that averages z-scored absolute Pearson and Spearman correlations, distance correlation, maximal information coefficient, and cosine similarity, and is visualized with clustered heatmaps, dendrograms, sparse networks, PCA loadings, and UMAP and t-SNE embeddings. Across both settings a stable two-block organization emerges. Frac co-locates with contrast, difference, and short-run statistics that capture high-frequency variation; when wavelets are included, detail-band terms from LH, HL, and HH join this group. Lac co-locates with measures of large, coherent structure (GLSZM zone size, GLRLM long-run, and high-gray-level emphases) and with GLCM homogeneity and correlation; LL (approximation) wavelet features align with this block. Pairwise associations are modest in the baseline but become very strong with wavelets (for example, Frac versus GLCM difference entropy, which summarizes the randomness of gray-level differences, with |r| ≈ 0.98; and Lac versus GLCM inverse difference normalized (IDN), a homogeneity measure that weights small intensity differences more heavily, with |r| ≈ 0.96). The multimetric consensus and geometric embeddings consistently place Frac and Lac in overlapping yet separable neighborhoods, indicating related but non-duplicative information. Practically, Frac and Lac are most useful when multiscale heterogeneity is central and they add a measurable signal beyond strong IBSI baselines (with or without wavelets); otherwise, closely related variance can be absorbed by standard texture families.
(This article belongs to the Section Radiation in Medical Imaging)
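As a rough illustration of the two non-IBSI measures above, a minimal NumPy sketch of box-counting fractal dimension and single-scale gliding-box lacunarity might look as follows; the binarization threshold, box sizes, and single lacunarity scale are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D binary mask via box counting:
    the slope of log N(s) versus log(1/s)."""
    counts = []
    for s in box_sizes:
        # Count the s-by-s boxes that contain at least one foreground pixel.
        n_boxes = 0
        for i in range(0, mask.shape[0], s):
            for j in range(0, mask.shape[1], s):
                if mask[i:i + s, j:j + s].any():
                    n_boxes += 1
        counts.append(n_boxes)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

def gliding_box_lacunarity(image, box_size=4):
    """Lacunarity at one box size: <M^2>/<M>^2 = 1 + var(M)/mean(M)^2,
    where M is the 'mass' inside a box glided over the image."""
    h, w = image.shape
    masses = np.array([
        image[i:i + box_size, j:j + box_size].sum()
        for i in range(h - box_size + 1)
        for j in range(w - box_size + 1)
    ], dtype=float)
    return 1.0 + masses.var() / masses.mean() ** 2

# Toy usage on a random 64x64 texture, thresholded for box counting.
rng = np.random.default_rng(0)
roi = rng.random((64, 64))
print(box_counting_dimension(roi > 0.5), gliding_box_lacunarity(roi))
```

In the study, these two scalars are simply appended to the 92 IBSI descriptors before the multimetric consensus comparison is run.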

19 pages, 3665 KB  
Article
Multivariate Multiscale Cosine Similarity Entropy and Its Application to Examine Circularity Properties in Division Algebras
by Hongjian Xiao, Theerasak Chanwimalueang and Danilo P. Mandic
Entropy 2022, 24(9), 1287; https://doi.org/10.3390/e24091287 - 13 Sep 2022
Cited by 7 | Viewed by 2322
Abstract
The extension of sample entropy methodologies to multivariate signals has received considerable attention, with traditional univariate entropy methods, such as sample entropy (SampEn) and fuzzy entropy (FuzzyEn), introduced to measure the complexity of chaotic systems in terms of irregularity and randomness. The corresponding multivariate methods, multivariate multiscale sample entropy (MMSE) and multivariate multiscale fuzzy entropy (MMFE), were developed to explore the structural richness within signals at high scales. However, the requirement of high scales limits the selection of the embedding dimension, and thus the performance is unavoidably restricted by the trade-off between the data size and the required high scale. More importantly, the scale of interest varies across situations, yet little is known about the optimal setting of the scale range in MMSE and MMFE. To this end, we extend the univariate cosine similarity entropy (CSE) method to the multivariate case, and show that the resulting multivariate multiscale cosine similarity entropy (MMCSE) is capable of quantifying structural complexity through the degree of self-correlation within signals. The proposed approach relaxes the prohibitive constraints between the embedding dimension and data length, and aims to quantify the structural complexity based on the degree of self-correlation at low scales. The proposed MMCSE is applied to the examination of the complex and quaternion circularity properties of signals with varying correlation behaviors, and simulations show the MMCSE outperforming the standard methods, MMSE and MMFE.
(This article belongs to the Special Issue Entropy and Its Applications across Disciplines III)
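To make the multivariate machinery concrete, the sketch below shows the two preprocessing steps that such multiscale multivariate entropies rely on: channel-wise coarse-graining to a given temporal scale and the formation of composite delay vectors across channels. Using the same embedding dimension and lag for every channel is a simplification made here for brevity; an angular-distance entropy (as sketched under the last entry in this list) would then be evaluated on the resulting vectors at each scale.

```python
import numpy as np

def mv_coarse_grain(x, scale):
    """Coarse-grain a (channels, samples) array channel-wise by
    averaging non-overlapping windows of length `scale`."""
    c, n = x.shape
    k = n // scale
    return x[:, :k * scale].reshape(c, k, scale).mean(axis=2)

def mv_embed(x, m=2, tau=1):
    """Form composite delay vectors by concatenating, across channels,
    the m samples spaced tau apart that start at each time index."""
    c, n = x.shape
    n_vec = n - (m - 1) * tau
    per_channel = [
        np.array([x[ch, i:i + m * tau:tau] for i in range(n_vec)])
        for ch in range(c)
    ]
    return np.hstack(per_channel)       # shape: (n_vec, c * m)

# Toy usage: a 3-channel white-noise signal embedded at scale 2.
rng = np.random.default_rng(0)
sig = rng.standard_normal((3, 1000))
vectors = mv_embed(mv_coarse_grain(sig, scale=2))
print(vectors.shape)                    # (499, 6)
```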

14 pages, 4251 KB  
Article
Hierarchical Cosine Similarity Entropy for Feature Extraction of Ship-Radiated Noise
by Zhe Chen, Yaan Li, Hongtao Liang and Jing Yu
Entropy 2018, 20(6), 425; https://doi.org/10.3390/e20060425 - 1 Jun 2018
Cited by 27 | Viewed by 4519
Abstract
The classification performance of passive sonar can be improved by extracting the features of ship-radiated noise. Traditional feature extraction methods neglect the nonlinear features in ship-radiated noise, such as entropy. The multiscale sample entropy (MSE) algorithm has been widely used for quantifying the entropy of a signal, but it still has some limitations. To remedy this, the hierarchical cosine similarity entropy (HCSE) is proposed in this paper. Firstly, hierarchical decomposition is used to decompose a time series into subsequences. Then, the sample entropy (SE) is modified by using Shannon entropy rather than conditional entropy and angular distance instead of Chebyshev distance. Finally, the complexity of each subsequence is quantified by the modified SE. Simulation results show that the HCSE method overcomes some limitations of MSE; for example, undefined entropy is unlikely to occur in HCSE, and it is better suited to short time series. Compared with MSE, the experimental results show that the classification accuracy on real ship-radiated noise improves significantly, from 75% to 95.63%, with HCSE. Consequently, the proposed HCSE can be applied in practice.
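The hierarchical decomposition step can be sketched with the usual pairwise averaging and differencing operators; the node ordering and even-length truncation below are conventions of this sketch rather than a claim about the authors' exact implementation. Each returned subsequence would then be scored with the modified, angular-distance-based sample entropy (see the sketch under the last entry).

```python
import numpy as np

def hierarchical_decompose(x, levels=2):
    """Split a series into 2**levels subsequences by repeatedly applying a
    low-frequency (pairwise average) and a high-frequency (pairwise
    difference) operator to every node of the previous level."""
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        children = []
        for s in nodes:
            s = s[:len(s) - len(s) % 2]       # ensure an even length for pairing
            low = (s[0::2] + s[1::2]) / 2.0   # averaging operator
            high = (s[0::2] - s[1::2]) / 2.0  # differencing operator
            children.extend([low, high])
        nodes = children
    return nodes

# Toy usage: decompose 1024 noise samples into four length-256 subsequences.
rng = np.random.default_rng(0)
subsequences = hierarchical_decompose(rng.standard_normal(1024), levels=2)
print(len(subsequences), subsequences[0].shape)   # 4 (256,)
```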

23 pages, 2018 KB  
Article
Cosine Similarity Entropy: Self-Correlation-Based Complexity Analysis of Dynamical Systems
by Theerasak Chanwimalueang and Danilo P. Mandic
Entropy 2017, 19(12), 652; https://doi.org/10.3390/e19120652 - 30 Nov 2017
Cited by 44 | Viewed by 11480
Abstract
The nonparametric Sample Entropy (SE) estimator has become a standard for the quantification of structural complexity of nonstationary time series, even in critical cases of unfavorable noise levels. The SE has proven very successful for signals that exhibit a certain degree of underlying structure but do not obey standard probability distributions, a typical case in real-world scenarios such as physiological signals. However, the SE estimates structural complexity based on uncertainty rather than on (self) correlation, so that, for reliable estimation, the SE requires long data segments, is sensitive to spikes and erratic peaks in data, and, owing to its amplitude dependence, lacks precision for signals with long-term correlations. To this end, we propose a class of new entropy estimators based on the similarity of embedding vectors, evaluated through the angular distance, the Shannon entropy and the coarse-grained scale. Analysis of the effects of embedding dimension, sample size and tolerance shows that the so-introduced Cosine Similarity Entropy (CSE) and the enhanced Multiscale Cosine Similarity Entropy (MCSE) are amplitude-independent and therefore superior to the SE when applied to short time series. Unlike the SE, the CSE is shown to yield valid entropy values over a broad range of embedding dimensions. By evaluating the CSE and the MCSE over a variety of benchmark synthetic signals as well as real-world data (heart rate variability in three different cardiovascular pathologies), the proposed algorithms are demonstrated to quantify degrees of structural complexity in the context of self-correlation over small to large temporal scales, thus offering physically meaningful interpretations and rigor in understanding the intrinsic properties of the structural complexity of a system, such as the number of its degrees of freedom.
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
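A minimal sketch of the CSE computation described here, with the tolerance r, embedding dimension, and lag set to illustrative defaults rather than the values recommended in the paper, could read:

```python
import numpy as np

def cosine_similarity_entropy(x, m=2, tau=1, r=0.07):
    """Sketch of Cosine Similarity Entropy: delay-embed the series, measure
    angular distance between embedding vectors, estimate the probability of
    'similar' pairs, and return the Shannon entropy of that probability."""
    x = np.asarray(x, dtype=float)
    n_vec = len(x) - (m - 1) * tau
    emb = np.array([x[i:i + m * tau:tau] for i in range(n_vec)])
    norms = np.linalg.norm(emb, axis=1)
    norms[norms == 0.0] = np.finfo(float).eps     # guard against zero vectors
    cos_sim = np.clip((emb @ emb.T) / np.outer(norms, norms), -1.0, 1.0)
    ang = np.arccos(cos_sim) / np.pi              # angular distance in [0, 1]
    iu = np.triu_indices(n_vec, k=1)              # distinct pairs only
    b = np.mean(ang[iu] < r)                      # P(two patterns are similar)
    if b in (0.0, 1.0):
        return 0.0                                # degenerate: no uncertainty
    return -(b * np.log2(b) + (1.0 - b) * np.log2(1.0 - b))

# Toy usage: white noise versus a slowly varying sinusoid.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
print(cosine_similarity_entropy(rng.standard_normal(1000)),
      cosine_similarity_entropy(np.sin(2 * np.pi * 0.5 * t)))
```

The multiscale variant (MCSE) repeats this computation on coarse-grained copies of the series across temporal scales.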
