Search Results (96)

Search Parameters:
Keywords = power-divergence measure

15 pages, 9497 KiB  
Article
Tapered Quantum Cascade Laser Achieving Low Divergence Angle and High Output Power
by Zizhuo Liu, Hongxiao Li, Jiagang Chen, Anlan Chen, Shan Niu, Changlei Wu, Yongqiang Sun, Xingli Zhong, Hui Su, Hao Xu, Jinchuan Zhang, Jiang Wu and Fengqi Liu
Sensors 2025, 25(15), 4572; https://doi.org/10.3390/s25154572 - 24 Jul 2025
Viewed by 264
Abstract
In this work, we present a high-performance tapered quantum cascade laser (QCL) designed to achieve both high output power and low divergence angle. By integrating a tapered waveguide with a Fabry–Perot structure, significant improvements of tapered QCL devices in both output power and beam quality are demonstrated. The optimized 50 µm wide tapered QCL achieved a maximum output power of 2.76 W in pulsed operation with a slope efficiency of 3.52 W/A and a wall-plug efficiency (WPE) of 16.2%, while reducing the divergence angle to 13.01°. The device maintained a maximum power of 1.34 W with a WPE exceeding 8.2%, measured under room temperature and continuous wave (CW) operation. Compared to non-tapered Fabry–Perot QCLs, the tapered devices exhibited a nearly 10-fold increase in output power and over 200% improvement in WPE. This work provides a promising pathway for advancing mid-infrared laser technology, particularly for applications requiring high power, low divergence, and temperature stability. Full article
(This article belongs to the Special Issue Recent Trends in Quantum Sensing)

24 pages, 3524 KiB  
Article
Transient Stability Assessment of Power Systems Based on Temporal Feature Selection and LSTM-Transformer Variational Fusion
by Zirui Huang, Zhaobin Du, Jiawei Gao and Guoduan Zhong
Electronics 2025, 14(14), 2780; https://doi.org/10.3390/electronics14142780 - 10 Jul 2025
Viewed by 257
Abstract
To address the challenges brought by the high penetration of renewable energy in power systems, such as multi-scale dynamic interactions, high feature dimensionality, and limited model generalization, this paper proposes a transient stability assessment (TSA) method that combines temporal feature selection with deep learning-based modeling. First, a two-stage feature selection strategy is designed using the inter-class Mahalanobis distance and Spearman rank correlation. This helps extract highly discriminative and low-redundancy features from wide-area measurement system (WAMS) time-series data. Then, a parallel LSTM-Transformer architecture is constructed to capture both short-term local fluctuations and long-term global dependencies. A variational inference mechanism based on a Gaussian mixture model (GMM) is introduced to enable dynamic representation fusion and uncertainty modeling. A composite loss function combining improved focal loss and Kullback–Leibler (KL) divergence regularization is designed to enhance model robustness and training stability under complex disturbances. The proposed method is validated on a modified IEEE 39-bus system. Results show that it outperforms existing models in accuracy, robustness, interpretability, and other aspects. This provides an effective solution for TSA in power systems with high renewable energy integration. Full article
(This article belongs to the Special Issue Advanced Energy Systems and Technologies for Urban Sustainability)
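The redundancy-filtering half of the two-stage strategy above can be sketched with ordinary rank statistics. A minimal, illustrative version (the function name, threshold, and greedy order are assumptions of this sketch, not the authors' implementation):

```python
import numpy as np
from scipy.stats import spearmanr

def prune_redundant_features(X, threshold=0.9):
    """Greedy redundancy filter: keep a feature only if its absolute
    Spearman rank correlation with every already-kept feature stays
    below `threshold`. (Name and threshold are invented for this sketch.)"""
    kept = []
    for j in range(X.shape[1]):
        if all(abs(spearmanr(X[:, j], X[:, k])[0]) < threshold for k in kept):
            kept.append(j)
    return kept

# Toy data: feature 1 is a near-copy of feature 0, feature 2 is independent.
rng = np.random.default_rng(0)
a = rng.normal(size=200)
X = np.column_stack([a, a + 0.01 * rng.normal(size=200), rng.normal(size=200)])
kept = prune_redundant_features(X)
```

A discriminativeness score (the paper uses the inter-class Mahalanobis distance) would be applied before this filter to rank candidates.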

16 pages, 662 KiB  
Article
Augmenting Naïve Bayes Classifiers with k-Tree Topology
by Fereshteh R. Dastjerdi and Liming Cai
Mathematics 2025, 13(13), 2185; https://doi.org/10.3390/math13132185 - 4 Jul 2025
Viewed by 274
Abstract
The Bayesian network is a directed, acyclic graphical model that can offer a structured description of probabilistic dependencies among random variables. As powerful tools for classification tasks, Bayesian classifiers often require computing joint probability distributions, which can be computationally intractable due to potential full dependencies among feature variables. On the other hand, Naïve Bayes, which presumes zero dependencies among features, trades accuracy for efficiency and often underperforms. As a result, non-zero dependency structures, such as trees, are often used as more feasible probabilistic graph approximations; in particular, Tree Augmented Naïve Bayes (TAN) has been demonstrated to outperform Naïve Bayes and has become a popular choice. For applications where a variable is strongly influenced by multiple other features, TAN has been further extended to the k-dependency Bayesian classifier (KDB), where one feature can depend on up to k other features (for a given k ≥ 2). In such cases, however, the selection of the k parent features for each variable is often made through heuristic search methods (such as sorting), which do not guarantee an optimal approximation of network topology. In this paper, the novel notion of k-tree Augmented Naïve Bayes (k-TAN) is introduced to augment Naïve Bayesian classifiers with k-tree topology as an approximation of Bayesian networks. It is proved that, under the Kullback–Leibler divergence measurement, the k-tree topology that loses the minimum information is that of a maximum spanning k-tree, where the edge weights of the graph are the mutual information between random variables conditioned on the class label. In addition, while finding a maximum spanning k-tree is in general NP-hard for fixed k ≥ 2, this work shows that the approximation problem can be solved in time O(n^(k+1)) if the spanning k-tree is also required to retain a given Hamiltonian path in the graph. Therefore, this algorithm can be employed to ensure efficient approximation of Bayesian networks by k-tree augmented Naïve Bayesian classifiers with a guaranteed minimum loss of information. Full article
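For orientation, the k = 1 special case of this construction is the classic TAN structure step: a maximum spanning tree over class-conditional mutual information edge weights, in the style of Chow–Liu. A small sketch under that simplification; it does not implement the paper's maximum spanning k-tree algorithm:

```python
import numpy as np
from itertools import combinations

def cond_mutual_info(x, y, c):
    """Estimate I(X; Y | C) from counts, for small discrete arrays."""
    mi = 0.0
    for cv in np.unique(c):
        mask = c == cv
        pc = mask.mean()
        xs, ys = x[mask], y[mask]
        for xv in np.unique(xs):
            for yv in np.unique(ys):
                pxy = np.mean((xs == xv) & (ys == yv))
                px, py = np.mean(xs == xv), np.mean(ys == yv)
                if pxy > 0:
                    mi += pc * pxy * np.log(pxy / (px * py))
    return mi

def tan_tree(X, c):
    """Maximum spanning tree (Prim's algorithm) over conditional-mutual-
    information edge weights: the classic TAN structure step (k = 1)."""
    d = X.shape[1]
    W = np.zeros((d, d))
    for i, j in combinations(range(d), 2):
        W[i, j] = W[j, i] = cond_mutual_info(X[:, i], X[:, j], c)
    in_tree, edges = {0}, []
    while len(in_tree) < d:
        i, j = max(((a, b) for a in in_tree for b in range(d) if b not in in_tree),
                   key=lambda e: W[e])
        edges.append((i, j))
        in_tree.add(j)
    return edges

# Toy demo: feature 1 is a noisy copy of feature 0, feature 2 is independent,
# so the tree should join features 0 and 1.
rng = np.random.default_rng(1)
c = rng.integers(0, 2, 500)
x0 = rng.integers(0, 2, 500)
x1 = (x0 + (rng.random(500) < 0.05)) % 2
x2 = rng.integers(0, 2, 500)
edges = tan_tree(np.column_stack([x0, x1, x2]), c)
```

The paper's contribution generalizes this tree (1-tree) to a maximum spanning k-tree, which is what makes the Hamiltonian-path constraint necessary for tractability.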

24 pages, 5959 KiB  
Article
An Information Geometry-Based Track-Before-Detect Algorithm for Range-Azimuth Measurements in Radar Systems
by Jinguo Liu, Hao Wu, Zheng Yang, Xiaoqiang Hua and Yongqiang Cheng
Entropy 2025, 27(6), 637; https://doi.org/10.3390/e27060637 - 14 Jun 2025
Viewed by 520
Abstract
The detection of weak moving targets in heterogeneous clutter backgrounds is a significant challenge in radar systems. In this paper, we propose a track-before-detect (TBD) method based on information geometry (IG) theory applied to range-azimuth measurements, which extends the IG detectors to multi-frame detection through inter-frame information integration. The approach capitalizes on the distinctive benefits of the information geometry detection framework in scenarios with strong clutter, while enhancing the integration of information across multiple frames within the TBD approach. Specifically, target and clutter trajectories in multi-frame range-azimuth measurements are modeled on the Hermitian positive definite (HPD) and power spectrum (PS) manifolds. A scoring function based on information geometry, which uses Kullback–Leibler (KL) divergence as a geometric metric, is then devised to assess these motion trajectories. Moreover, this study devises a solution framework employing dynamic programming (DP) with constraints on state transitions, culminating in an integrated merit function. This algorithm identifies target trajectories by maximizing the integrated merit function. Experimental validation using real-recorded sea clutter datasets showcases the effectiveness of the proposed algorithm, yielding a minimum 3 dB enhancement in signal-to-clutter ratio (SCR) compared to traditional approaches. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
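For zero-mean Gaussian clutter models, the KL divergence between HPD covariance matrices used as the geometric metric has a standard closed form. A minimal sketch of just that scoring ingredient (covariance estimation from radar returns, and the PS-manifold counterpart, are omitted):

```python
import numpy as np

def kl_hpd(P, Q):
    """KL divergence D(N(0, P) || N(0, Q)) between zero-mean Gaussians,
    the usual closed-form measure between HPD covariance matrices:
    0.5 * (tr(Q^-1 P) - n - log det(Q^-1 P))."""
    n = P.shape[0]
    M = np.linalg.inv(Q) @ P
    _, logdet = np.linalg.slogdet(M)
    return 0.5 * (np.trace(M) - n - logdet)

d_same = kl_hpd(np.eye(2), np.eye(2))        # identical matrices: zero
d_scaled = kl_hpd(2 * np.eye(2), np.eye(2))  # mismatched scale: positive
```

Note that KL divergence is asymmetric, which is why the abstract describes it as a geometric metric for a scoring function rather than a true distance.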

37 pages, 776 KiB  
Article
Fractional Inclusion Analysis of Superquadratic Stochastic Processes via Center-Radius Total Order Relation with Applications in Information Theory
by Mohsen Ayyash, Dawood Khan, Saad Ihsan Butt and Youngsoo Seol
Fractal Fract. 2025, 9(6), 375; https://doi.org/10.3390/fractalfract9060375 - 12 Jun 2025
Viewed by 321
Abstract
This study presents, for the first time, a new class of interval-valued superquadratic stochastic processes and examines their core properties through the lens of the center-radius total order relation on intervals. These processes serve as a powerful tool for modeling uncertainty in stochastic systems involving interval-valued data. By utilizing their intrinsic structure, we derive sharpened versions of Jensen-type and Hermite–Hadamard-type inequalities, along with their fractional extensions, within the framework of mean-square stochastic Riemann–Liouville fractional integrals. The theoretical findings are validated through extensive graphical representations and numerical simulations. Moreover, the applicability of the proposed processes is demonstrated in the domain of information theory by constructing novel stochastic divergence measures and Shannon’s entropy grounded in interval calculus. The outcomes of this work lay a solid foundation for further exploration in stochastic analysis, particularly in advancing generalized integral inequalities and formulating new stochastic models under uncertainty. Full article

29 pages, 2693 KiB  
Article
Divergence Measures for Globular T-Spherical Fuzzy Sets with Application in Selecting Solar Energy Systems
by Miin-Shen Yang, Yasir Akhtar and Mehboob Ali
Symmetry 2025, 17(6), 872; https://doi.org/10.3390/sym17060872 - 3 Jun 2025
Viewed by 333
Abstract
Despite advancements in divergence and distance measures across fuzzy set extensions, the development of such measures for Globular T-Spherical Fuzzy Sets (G-TSFSs) remains significantly unexplored. Existing approaches often fall short in capturing the rich semantics and high-dimensional uncertainty that G-TSFSs represent, limiting their utility in complex decision environments. This study is motivated by the need to fill this critical gap and advance decision science through more expressive and structurally aligned tools. This paper introduces a suite of novel divergence measures (Div-Ms) specifically formulated for G-TSFSs, a powerful tool for capturing uncertainty in multi-criteria group decision-making (MCGDM) under complex conditions. These Div-Ms serve as the foundation for developing new distance measures (Dis-Ms) and similarity measures (SMs), where both Dis-Ms and SMs are symmetry-based and their essential mathematical properties and supporting theorems are rigorously established. Leveraging these constructs, we propose a robust G-TSF-TOPSIS framework and apply it to a real-world problem, selecting optimal solar energy systems (SESs) for a university context. The model integrates expert evaluations, assuming equal importance due to their pivotal and complementary roles. A sensitivity analysis over the tunable parameter (ranging from 4.0 to 5.0 with an increment of 0.2) confirms the robustness and stability of the decision outcomes, with no changes observed in the final rankings. Comparative analysis with existing models shows superiority and soundness of the proposed methods. These results underscore the practical significance and theoretical soundness of the proposed approach. The study concludes by acknowledging its limitations and suggesting directions for future research, particularly in exploring adaptive expert weighting strategies for broader applicability. Full article
(This article belongs to the Section Mathematics)

24 pages, 3707 KiB  
Article
Comparison of a Continuous Forest Inventory to an ALS-Derived Digital Inventory in Washington State
by Thomas Montzka, Steve Scharosch, Michael Huebschmann, Mark V. Corrao, Douglas D. Hardman, Scott W. Rainsford, Alistair M. S. Smith and The Confederated Tribes and Bands of the Yakama Nation
Remote Sens. 2025, 17(10), 1761; https://doi.org/10.3390/rs17101761 - 18 May 2025
Viewed by 523
Abstract
The monitoring and assessment of forest conditions has traditionally relied on continuous forest inventory (CFI) plots, where all plot trees are regularly measured at discrete locations, then plots are grouped as representative samples of forested areas via stand-based inventory expectations. Remote sensing data acquisitions, such as airborne laser scanning (ALS), are becoming more widely applied to operational forestry to derive similar stand-based inventories. Although ALS systems are widely applied to assess forest metrics associated with crowns and canopies, limited studies have compared ALS-derived digital inventories to CFI datasets. In this study, we conducted an analysis of over 1000 CFI plot locations on ~611,000 acres and compared it to a single-tree derived inventory. Inventory metrics from CFI data were forward modeled from 2016 to 2019 using the USDA Forest Service Forest Vegetation Simulator (FVS) to produce estimates of trees per acre (TPA), basal area (BA) per tree or per plot, basal area per acre (BAA), and volume per acre (VPA) and compared to the ALS-derived Digital Inventory® (DI) of 2019. The CFI data provided greater on-plot tree counts, BA, and volume compared to the DI when limited to trees ≥5 inches DBH. On-plot differences were less significant for taller trees and increasingly diverged for shorter trees (<20 feet tall) known to be less detectable by ALS. The CFI volume was found to be 44% higher than the ALS-derived DI suggesting mean volume per acre as derived from plot sampling methods may not provide accurate results when expanded across the landscape given variable forest conditions not captured during sampling. These results provide support that when used together, CFI and DI datasets represent a powerful set of tools within the forest management toolkit. Full article
(This article belongs to the Special Issue Remote Sensing and Lidar Data for Forest Monitoring)

22 pages, 1779 KiB  
Article
Barriers to Building Information Modeling (BIM) Implementation in Late-Adopting EU Countries: The Case of Portugal
by Miguel Pereira Lourenço, Amílcar Arantes and António Aguiar Costa
Buildings 2025, 15(10), 1651; https://doi.org/10.3390/buildings15101651 - 14 May 2025
Viewed by 974
Abstract
Adopting building information modeling (BIM) within the architecture, engineering, and construction (AEC) industry presents an opportunity to tackle persistent challenges, such as chronic productivity deficits and emerging imperatives like sustainability. However, BIM implementation (BIMI) across European Union (EU) countries diverges due to different contexts and the complexity of BIM. This study aims to identify the main barriers to BIMI and recommend effective mitigation measures in Portugal, a late-adopting EU country. Initially, 28 BIMI barriers were identified through a literature review. Experts in a Delphi survey then selected 15 critical barriers. An interpretive structural modeling (ISM) model was developed with input from a focus group to clarify the hierarchical relationships among barriers, and an impact matrix cross-reference multiplication applied to a classification (MICMAC) analysis was performed to evaluate the barriers’ driving and dependence powers. The resulting main barriers to BIMI include a lack of evaluation mechanisms, ignorance of BIM benefits, a shortage of skilled professionals, limited experience and cooperation, resistance to change, and inadequate top management support. Finally, experts in a second focus group developed mitigation measures to address the main barriers while ensuring the measures affect the entire barrier system. These findings will assist researchers, policymakers, and practitioners in late-adopter EU countries in addressing these barriers effectively. Full article

11 pages, 897 KiB  
Article
Epidemiological and Socioeconomic Disparities in the 1742–1743 Epidemic: A Comparative Analysis of Urban Centers and Indigenous Populations Along the Royal Road
by Jorge Hugo Villafañe
Epidemiologia 2025, 6(2), 25; https://doi.org/10.3390/epidemiologia6020025 - 12 May 2025
Viewed by 505
Abstract
Background/Objectives: Epidemics have historically shaped societies, influencing demographic structures, social organization, and economic stability. The 1742–1743 epidemic had a profound impact on populations along the Royal Road (Camino Real), the main colonial corridor between Buenos Aires and Lima. However, its specific demographic and socio-economic effects remain underexplored. This study aims to examine these impacts of the 1742–1743 epidemic through a comparative analysis of urban centers and Indigenous communities. Methods: A historical–comparative approach was employed, analyzing secondary sources including parish records and colonial administrative documents. This study assessed excess mortality and socio-economic consequences across different population groups and settlement types. Results: Mortality rates increased dramatically—up to twelve times the pre-epidemic average in Cordova (Córdoba) and by 45% in Santa Fe—disproportionately affecting Indigenous and enslaved populations. Urban centers experienced severe economic disruption and slow recovery, whereas Indigenous communities and Jesuit missions demonstrated greater resilience. Their communal strategies and early isolation measures contributed to a faster demographic stabilization. Additionally, the epidemic weakened colonial governance in some areas, altering local power structures. Conclusions: The epidemic of 1742–1743 revealed divergent patterns of vulnerability and resilience. Comparative analysis underscores recurring themes in the epidemic response and recovery, drawing relevant parallels with contemporary crises such as COVID-19. Recognizing these historical patterns of adaptation can inform present and future public health strategies. The terminology “plague” is used based on contemporary sources and not confirmed clinically. Full article

17 pages, 2746 KiB  
Article
Semi-Supervised Class-Incremental Sucker-Rod Pumping Well Operating Condition Recognition Based on Multi-Source Data Distillation
by Weiwei Zhao, Bin Zhou, Yanjiang Wang and Weifeng Liu
Sensors 2025, 25(8), 2372; https://doi.org/10.3390/s25082372 - 9 Apr 2025
Cited by 1 | Viewed by 555
Abstract
The complex and variable operating conditions of sucker-rod pumping wells pose a significant challenge for the timely and accurate identification of oil well operating conditions. Effective deep learning based on measured multi-source data obtained from the sucker-rod pumping well production site offers a promising solution to the challenge. However, existing deep learning-based operating condition recognition methods are constrained by several factors: the limitations of traditional operating condition recognition methods based on single-source and multi-source data, the need for large amounts of labeled data for training, and the high robustness requirement for recognizing complex and variable data. Therefore, we propose a semi-supervised class-incremental sucker-rod pumping well operating condition recognition method based on measured multi-source data distillation. First, we select measured ground dynamometer cards and measured electrical power cards as information sources, construct graph neural network teacher models for the data sources, and dynamically fuse the prediction probabilities of the teacher models through the Squeeze-and-Excitation attention mechanism. Then, we introduce a multi-source data distillation loss. It uses Kullback–Leibler (KL) divergence to measure the difference between the output logits of the teacher and student models, which helps reduce the forgetting of old operating condition category knowledge during class-incremental learning. Finally, we employ a multi-source semi-supervised graph classification method based on enhanced label propagation, which improves the label propagation method through a logistic regression classifier. This method can deeply explore the potential relationship between labeled and unlabeled samples, so as to further enhance the classification performance. Extensive experimental results show that the proposed method achieves superior recognition performance and enhanced engineering practicality in real-world class-incremental oil extraction production scenarios with complex and variable operating conditions. Full article
(This article belongs to the Section Internet of Things)
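The multi-source distillation loss described above builds on the standard teacher-student KL penalty over temperature-softened outputs. A generic sketch of that penalty (the temperature, array shapes, and logit values are illustrative, not taken from the paper):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_kl(teacher_logits, student_logits, T=2.0):
    """Mean KL(teacher || student) between softened output distributions;
    the T**2 factor keeps gradient magnitudes comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T**2 * np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1))

t = np.array([[2.0, 0.5, 0.1]])
loss_same = distillation_kl(t, t)            # identical outputs: zero penalty
loss_diff = distillation_kl(t, t[:, ::-1])   # disagreeing outputs: positive
```

In the paper's setting, the teacher side is a fused prediction from several per-source teacher models, but the KL term itself has this shape.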

18 pages, 1972 KiB  
Article
A Physics-Guided Parameter Estimation Framework for Cold Spray Additive Manufacturing Simulation
by Md Munim Rayhan, Abderrachid Hamrani, Md Sharif Ahmed Sarker, Arvind Agarwal and Dwayne McDaniel
Coatings 2025, 15(4), 364; https://doi.org/10.3390/coatings15040364 - 21 Mar 2025
Viewed by 576
Abstract
This work presents a physics-guided parameter estimation framework for cold spray additive manufacturing (CSAM), focusing on simulating and validating deposit profiles across diverse process conditions. The proposed model employs a two-zone flow representation: quasi-constant velocity near the nozzle exit followed by an exponentially decaying free jet to capture particle acceleration and impact dynamics. The framework employs a comprehensive approach by numerically integrating drag-dominated particle trajectories to predict deposit formation with high accuracy. This physics-based framework incorporates both operational and geometric parameters to ensure robust prediction capabilities. Operational parameters include spray angle, standoff distance, traverse speed, and powder feed rate, while geometric factors encompass nozzle design characteristics such as exit diameter and divergence angle. Validation is performed using 36 experimentally measured profiles of commercially pure titanium powder. The simulator shows excellent agreement with the experimental data, achieving a global root mean square error (RMSE) of 0.048 mm and a coefficient of determination R^2 = 0.991, improving the mean absolute error by more than 40% relative to a neural network-based approach. Sensitivity analyses reveal that nozzle geometry, feed rate, and critical velocity strongly modulate the amplitude and shape of the deposit. Notably, decreasing the nozzle exit diameter or divergence angle significantly increases local deposition rates, while increasing the standoff distance dampens particle velocities, thereby reducing deposit height. Although the partial differential equation (PDE)-based framework entails a moderate increase in computational time (about 50 s per run, roughly 2.5 times longer than simpler empirical models), this remains practical for most process design and optimization tasks. Beyond its accuracy, the PDE-based simulation framework's principal advantage lies in its minimal reliance on sampling data. It can readily be adapted to new materials or untested process parameters, making it a powerful predictive tool in cold spray process design. This study underscores the simulator's potential for guiding parameter selection, improving process reliability, and offering deeper physical insights into cold spray deposit formation. Full article
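The two-zone gas model can be illustrated by marching a drag-dominated particle along the nozzle axis: constant gas velocity over a core length, then exponential decay. A toy sketch with invented parameter values and a simple relaxation-time drag law (not the calibrated framework from the paper):

```python
import numpy as np

def particle_impact_velocity(v_gas0=800.0, core_len=0.02, decay_len=0.05,
                             standoff=0.03, v0=50.0, tau=1e-4, dx=1e-5):
    """Euler-march a particle from nozzle exit to the substrate.
    Two-zone gas model as in the abstract: quasi-constant velocity over
    `core_len` metres, then exponential decay over `decay_len`.
    All parameter values here are illustrative, not fitted cold-spray data."""
    x, v = 0.0, v0
    while x < standoff:
        if x < core_len:
            v_gas = v_gas0                       # zone 1: potential core
        else:                                    # zone 2: decaying free jet
            v_gas = v_gas0 * np.exp(-(x - core_len) / decay_len)
        dvdt = (v_gas - v) / tau                 # linearized drag, time scale tau
        v += dvdt * (dx / max(v, 1e-6))          # dt = dx / v for a spatial march
        x += dx
    return v

v_impact = particle_impact_velocity()
```

With these numbers the particle accelerates toward, but never overshoots, the local gas velocity; the real framework integrates a full drag law and links impact velocity to local deposition through a critical-velocity criterion.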

36 pages, 1293 KiB  
Article
Exploring the Validity of Adolescent Responses to a Measure of Psychological Flexibility and Inflexibility
by Caleb D. Farley and Tyler L. Renshaw
Behav. Sci. 2025, 15(2), 197; https://doi.org/10.3390/bs15020197 - 12 Feb 2025
Viewed by 1564
Abstract
Validating measures of psychological flexibility (PF) and psychological inflexibility (PI) has occurred in multiple adult samples, but little research has validated PF and PI measures with adolescents. This manuscript describes two studies exploring the validity of responses to the Multidimensional Psychological Flexibility Inventory (MPFI) with two samples of adolescents. The first study used exploratory factor analyses on responses to the MPFI with a sample of 16–17-year-olds (N = 249). The results yielded a reduced and simplified measurement model that consisted of two general factors: one for PF and the other for PI. These exploratory findings were further investigated with confirmatory factor analyses in the second study, with a larger sample of 14–17-year-olds (N = 503). The results from the second study generally confirmed the factor model from the first study. Findings from both studies showed that scores derived from the reduced MPFI measurement model evidenced convergent and divergent validity with a variety of mental health criterion measures. Moreover, findings from the second study showed that PF and PI scores had differential predictive power on different concurrent mental health outcomes. This discussion highlights the implications of measuring PF and PI in adolescents, considers limitations of the present studies, and recommends next steps for research. Full article
(This article belongs to the Section Psychiatric, Emotional and Behavioral Disorders)

22 pages, 6709 KiB  
Article
Photobiomodulation LED Devices for Home Use: Design, Function and Potential: A Pilot Study
by Mark Cronshaw, Steven Parker, Omar Hamadah, Josep Arnabat-Dominguez and Martin Grootveld
Dent. J. 2025, 13(2), 76; https://doi.org/10.3390/dj13020076 - 10 Feb 2025
Cited by 1 | Viewed by 3575
Abstract
Background/Objectives: Many commercial light-emitting diode (LED) devices are available for consumer home usage. The performance characteristics, with respect to dosimetry, of many of the devices currently on direct sale to the public have not been subject to formal appraisal. In order to ‘bridge the gap’ between the evidence-based photobiomodulation therapy (PBMT) community and other interested parties, an evaluation is made of a selection of torch-type hand-held LED PBMT products currently available for home use. Methods: Five intra-oral and hand-held LED PBMT devices were randomly selected. The optical delivery parameters of the devices were measured, including the beam divergence angle and surface area exposure, as well as the output power at the level of the LEDs. The surface and sub-surface temperature changes in porcine tissue samples were assessed under standardised conditions. The manufacturers’ patient instructions were correlated with the measured optical parameters. Calculations were made of irradiance and surface radiant exposure. Consumer satisfaction ratings and feedback data were collated, and a relevant statistical analysis conducted. Results: The results were heterogeneous, with a wide range of applied wavelengths, output powers and irradiances. Power output stability was variable and, together with a wide beam divergence angle of 74°, the manufacturers’ directions for dosimetry were found to be inconsistent with accurate dose delivery. Conclusions: The manufacturers’ proposed dosimetry fails to consider the relevance of the beam divergence angle and of optical attenuation due to scatter and absorption. Appropriate instructions on how best to gain and optimise an acceptable clinical outcome were inconsistent with an evidence-based approach. Subject to validation by well-planned clinical trials, the concept of home PBMT may open interesting new therapeutic approaches. Full article
(This article belongs to the Special Issue Laser Dentistry: The Current Status and Developments)
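The abstract above faults the manufacturers' dosimetry for ignoring beam divergence. As a rough geometric illustration (not from the article; the output power, aperture size, and distances below are hypothetical), a diverging beam's irradiance falls quickly once the device is held away from the tissue:

```python
import math

def irradiance_mw_cm2(power_mw: float, aperture_diameter_cm: float,
                      divergence_deg: float, distance_cm: float) -> float:
    """Irradiance of a diverging beam at a given distance from the source.

    Simplifying assumption: a circular spot whose radius grows linearly
    with distance at half the full divergence angle.
    """
    half_angle = math.radians(divergence_deg / 2)
    spot_radius = aperture_diameter_cm / 2 + distance_cm * math.tan(half_angle)
    spot_area = math.pi * spot_radius ** 2
    return power_mw / spot_area

# With the 74-degree divergence reported above and a hypothetical 100 mW
# source with a 1 cm aperture, irradiance drops sharply with distance:
at_contact = irradiance_mw_cm2(100, 1.0, 74, 0.0)
at_2cm = irradiance_mw_cm2(100, 1.0, 74, 2.0)
```

Under this simple model, moving the device 2 cm from the skin reduces irradiance by more than an order of magnitude, which is why fixed exposure times without a stated working distance cannot yield an accurate dose.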

38 pages, 487 KiB  
Article
Probability via Expectation Measures
by Peter Harremoës
Entropy 2025, 27(2), 102; https://doi.org/10.3390/e27020102 - 22 Jan 2025
Viewed by 1057
Abstract
Since the seminal work of Kolmogorov, probability theory has been based on measure theory, where the central components are so-called probability measures, defined as measures with total mass equal to 1. In Kolmogorov’s theory, a probability measure is used to model an experiment with a single outcome that will belong to exactly one out of several disjoint sets. In this paper, we present a different basic model where an experiment results in a multiset, i.e., for each of the disjoint sets we obtain the number of observations in the set. This new framework is consistent with Kolmogorov’s theory, but the theory focuses on expected values rather than probabilities. We present examples from testing goodness-of-fit, Bayesian statistics, and quantum theory, where the shifted focus gives new insight or better performance. We also provide several new theorems that address some problems related to the change in focus. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
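The goodness-of-fit setting mentioned in the abstract, where an experiment yields counts of observations in disjoint sets, is exactly where the Cressie–Read power-divergence family applies: it compares observed counts against expected counts. A minimal sketch (the counts are hypothetical illustration, not data from the article):

```python
import math

def power_divergence(observed, expected, lam=2/3):
    """Cressie-Read power-divergence statistic.

    lam=1 gives Pearson's chi-squared statistic; the limit lam -> 0
    gives the likelihood-ratio statistic G^2; lam=2/3 is the value
    Cressie and Read recommend.
    """
    if abs(lam) < 1e-12:
        # G^2 limit as lambda -> 0 (terms with zero observed count vanish)
        return 2 * sum(o * math.log(o / e)
                       for o, e in zip(observed, expected) if o > 0)
    return (2 / (lam * (lam + 1))) * sum(
        o * ((o / e) ** lam - 1) for o, e in zip(observed, expected))

# Observed multiset of outcome counts versus a uniform expectation:
obs = [18, 22, 30, 30]
exp = [25, 25, 25, 25]
chi2 = power_divergence(obs, exp, lam=1)  # equals Pearson's chi-squared here
```

When observed and expected totals agree, the lam=1 case reduces term by term to the familiar sum of (observed − expected)² / expected.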

27 pages, 5200 KiB  
Article
Assessing the Future ODYSEA Satellite Mission for the Estimation of Ocean Surface Currents, Wind Stress, Energy Fluxes, and the Mechanical Coupling Between the Ocean and the Atmosphere
by Marco Larrañaga, Lionel Renault, Alexander Wineteer, Marcela Contreras, Brian K. Arbic, Mark A. Bourassa and Ernesto Rodriguez
Remote Sens. 2025, 17(2), 302; https://doi.org/10.3390/rs17020302 - 16 Jan 2025
Viewed by 1139
Abstract
Over the past decade, several studies based on coupled ocean–atmosphere simulations have shown that the oceanic surface current feedback to the atmosphere (CFB) leads to a slow-down of the mean oceanic circulation and, overall, to the so-called eddy killing effect, i.e., a sink of kinetic energy from oceanic eddies to the atmosphere that damps the oceanic mesoscale activity by about 30%, with upscaling effects on large-scale currents. Despite significant improvements in the representation of western boundary currents and mesoscale eddies in numerical models, some discrepancies remain when comparing numerical simulations with satellite observations. These discrepancies include a stronger wind and wind stress response to surface currents and a larger air–sea kinetic energy flux from the ocean to the atmosphere in numerical simulations. However, altimetric gridded products are known to largely underestimate mesoscale activity, and the satellite observations operate at different spatial and temporal resolutions and do not simultaneously measure surface currents and wind stress, leading to large uncertainties in air–sea mechanical energy flux estimates. ODYSEA is a new satellite mission project that aims to simultaneously monitor total surface currents and wind stress with a spatial sampling interval of 5 km and 90% daily global coverage. This study evaluates the potential of ODYSEA to measure surface winds, currents, energy fluxes, and ocean–atmosphere coupling coefficients. To this end, we generated synthetic ODYSEA data from a high-resolution coupled ocean–wave–atmosphere simulation of the Gulf Stream using ODYSIM, the Doppler scatterometer simulator for ODYSEA. Our results indicate that ODYSEA would significantly improve the monitoring of eddy kinetic energy, the kinetic energy cascade, and air–sea kinetic energy flux in the Gulf Stream region. Despite the improvement over current measurements, the estimates of the coupling coefficients between surface currents and wind stress may still have large uncertainties due to the noise inherent in ODYSEA and to limitations in its wind stress measurement capabilities. This study demonstrates that halving the measurement noise in surface currents would lead to a more accurate estimation of the surface eddy kinetic energy and wind stress coupling coefficients. Since measurement noise in surface currents depends strongly on the square root of the transmit power of the Doppler scatterometer antenna, noise levels can be reduced by increasing the antenna length. However, exploring other alternatives, such as the use of neural networks, could also be a promising approach. Additionally, combining wind stress estimates from ODYSEA with other satellite products and numerical simulations could improve the representation of wind stress in gridded products. Future efforts should focus on assessing the potential of ODYSEA for quantifying the production of eddy kinetic energy through horizontal energy fluxes and air–sea energy fluxes related to divergent and rotational motions. Full article
(This article belongs to the Section Ocean Remote Sensing)
