
Search Results (248)

Search Parameters:
Keywords = phantom energy

16 pages, 15481 KB  
Article
Evaluation of Scatter Correction Methods in SPECT Images: A Phantom-Based Study of TEW and ESSE Methods
by Ryutaro Mori, Koichi Okuda, Tomoya Okamoto, Yoshihisa Niioka, Kazuya Tsushima, Masakatsu Tsurugaya, Shota Hosokawa and Yasuyuki Takahashi
Radiation 2026, 6(1), 1; https://doi.org/10.3390/radiation6010001 - 7 Jan 2026
Viewed by 184
Abstract
We compared scatter correction (SC) in single-photon emission computed tomography (SPECT) images using effective scatter source estimation (ESSE) and the triple-energy window (TEW) method. We acquired 99mTc and 123I images of brain, myocardial, and performance phantoms containing rods with different diameters. We assessed contrast ratios (CRs) and ROI-based noise metrics (coefficient of variation, signal-to-noise ratio, and contrast-to-noise ratio [CNR]). Under 99mTc, ESSE yielded higher CRs than TEW across all phantoms (mean difference 0.04, range 0.01–0.05) and produced the highest CNR in the myocardial phantom, improving the conspicuousness of the simulated defect. Under 123I, CR differences between ESSE and TEW were small and inconsistent (performance phantom: −0.04 to 0.06; brain phantom: −0.01 to 0.00). A Monte Carlo simulation (point source in air) showed substantial photopeak window penetration for cardiac high-resolution collimators (40.0%) but low penetration for medium-energy general-purpose collimators (5.1%), supporting photopeak contamination as a contributor to the 123I findings and potentially attenuating the apparent advantage of model-based SC that does not explicitly account for penetration photons. These findings suggest that SC selection should consider the radionuclide and imaging target and that ESSE might be a reasonable option for 99mTc myocardial imaging under the settings examined.
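
For orientation, the TEW method referenced above estimates photopeak scatter per pixel from two narrow sub-windows flanking the photopeak (an Ogawa-style trapezoidal estimate). A minimal sketch; the window widths and count arrays below are illustrative, not the paper's acquisition settings:

```python
import numpy as np

def tew_scatter_correct(peak, lower, upper, w_peak, w_lower, w_upper):
    """Triple-energy window (TEW) scatter correction.

    peak, lower, upper: 2D count arrays from the photopeak window and the
    narrow sub-windows just below/above it; w_* are window widths in keV.
    """
    # Trapezoidal estimate of the scatter counts inside the photopeak window
    scatter = (lower / w_lower + upper / w_upper) * w_peak / 2.0
    return np.clip(peak - scatter, 0.0, None)  # avoid negative counts

# Illustrative 99mTc setup: 28 keV photopeak window, 3 keV sub-windows
corrected = tew_scatter_correct(
    peak=np.random.poisson(100, (64, 64)).astype(float),
    lower=np.random.poisson(20, (64, 64)).astype(float),
    upper=np.random.poisson(5, (64, 64)).astype(float),
    w_peak=28.0, w_lower=3.0, w_upper=3.0)
```
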
18 pages, 10663 KB  
Article
Assessment of Image Quality Performance of a Photon-Counting Computed Tomography Scanner Approved for Whole-Body Clinical Applications
by Francesca Saveria Maddaloni, Antonio Sarno, Alessandro Loria, Anna Piai, Cristina Lenardi, Antonio Esposito and Antonella del Vecchio
Sensors 2025, 25(23), 7338; https://doi.org/10.3390/s25237338 - 2 Dec 2025
Cited by 1 | Viewed by 768
Abstract
Background: Photon-counting computed tomography (PCCT) represents a major technological advance in clinical CT imaging, offering superior spatial resolution, enhanced material discrimination, and potential radiation dose reduction compared to conventional energy-integrating detector systems. As the first clinically approved PCCT scanner becomes available, establishing a comprehensive characterization of its image quality is essential to understand its performance and clinical impact. Methods: Image quality was evaluated using a commercial quality assurance phantom with acquisition protocols typically used for three anatomical regions—head, abdomen/thorax, and inner ear—representing diverse clinical scenarios. Each region was scanned using both ultra-high-resolution (UHR, 120 × 0.2 mm slices) and conventional (144 × 0.4 mm slices) protocols. Conventional metrics, including signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), slice thickness accuracy, and uniformity, were assessed following international standards. Task-based analysis was also performed through target transfer function (TTF), noise power spectrum (NPS), and detectability index (d′) to evaluate diagnostic relevance. Results: UHR protocols provided markedly improved spatial resolution, particularly in inner ear imaging, as confirmed by TTF analysis, though with increased noise and reduced low-contrast detectability in certain conditions. CT numbers showed linear correspondence with known attenuation coefficients across all protocols. Conclusions: This study establishes a detailed technical characterization of the first clinical PCCT scanner, demonstrating significant improvements in spatial resolution and quantitative image analysis accuracy, while highlighting the need for noise–contrast optimization in high-resolution imaging.
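
The conventional ROI metrics listed (SNR, CNR) follow standard definitions; a minimal sketch, with the signal and background ROI masks assumed to be supplied by the user:

```python
import numpy as np

def roi_metrics(image, roi_mask, bg_mask):
    """SNR and CNR from a signal ROI and a background ROI (standard definitions)."""
    s_mean, b_mean = image[roi_mask].mean(), image[bg_mask].mean()
    b_std = image[bg_mask].std()
    snr = s_mean / b_std                    # signal mean over background noise
    cnr = abs(s_mean - b_mean) / b_std      # contrast over background noise
    return snr, cnr
```
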
(This article belongs to the Special Issue Recent Progress in X-Ray Medical Imaging and Detectors)
18 pages, 8164 KB  
Article
Development and Characterization of a Biodegradable Radiopaque PLA/Gd2O3 Filament for Bone-Equivalent Phantom Produced via Fused Filament Fabrication
by Özkan Özmen and Sena Dursun
Polymers 2025, 17(23), 3193; https://doi.org/10.3390/polym17233193 - 30 Nov 2025
Viewed by 610
Abstract
Additive manufacturing (AM) has rapidly evolved due to its design flexibility, ability to enable personalized fabrication, and reduced material waste. In the medical field, fused filament fabrication (FFF) facilitates the production of individualized anatomical models for surgical preparation, education, medical imaging, and calibration. However, the lack of filaments with X-ray attenuation similar to that of biological hard tissues limits their use in radiological imaging. To address this limitation, a radiopaque filament was developed by incorporating gadolinium oxide (Gd₂O₃) into a biodegradable poly(lactic acid) (PLA) matrix at 1, 3, and 5 wt.%. Thermal and rheological properties were characterized using differential scanning calorimetry (DSC), thermogravimetric analysis (TGA), and melt flow index (MFI) analyses, revealing minor variations that did not affect printability under standard FFF conditions (200 °C nozzle, 60 °C build plate, 0.12 mm layer height). Microstructural analysis via field emission scanning electron microscopy (FESEM), energy-dispersive X-ray spectroscopy (EDX), elemental mapping, and micro-computed tomography (micro-CT) confirmed homogeneous Gd₂O₃ dispersion without nozzle blockage. Radiopacity was evaluated using gyroid infill cubes, and increasing Gd₂O₃ content enhanced X-ray attenuation, with 3 wt.% Gd₂O₃ reaching Hounsfield unit (HU) values comparable to cortical bone. Finally, the L1 vertebra phantom fabricated from the 3 wt.% Gd₂O₃ filament exhibited mean HU values of approximately +200 to +250 HU at 50% infill density (trabecular bone region) and around +1000 HU at 100% infill density (cortical bone region), demonstrating the filament's potential for producing cost-effective, radiopaque, and biodegradable phantoms for computed tomography (CT) imaging.
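
For scale, the HU values quoted above relate a measured linear attenuation coefficient to that of water; a one-function sketch of the standard definition:

```python
def hounsfield(mu, mu_water):
    """CT number in HU from linear attenuation coefficients (air ~ -1000, water = 0)."""
    return 1000.0 * (mu - mu_water) / mu_water
```
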
(This article belongs to the Special Issue Latest Progress in the Additive Manufacturing of Polymeric Materials)
17 pages, 754 KB  
Article
Non-Linear f(Q,T) Gravity and the Late-Time Acceleration of the Universe
by Alnadhief H. A. Alfedeel
Universe 2025, 11(12), 382; https://doi.org/10.3390/universe11120382 - 21 Nov 2025
Viewed by 315
Abstract
This study examines cosmic acceleration in the framework of f(Q,T) gravity and compares it to the standard ΛCDM model. It considers a generalized nonlinear form of the nonmetricity function, f(Q,T) = Q + α₀Q²/H₀² + β₀T + η₀, where α₀, β₀, and η₀ are constants and H₀ is the current value of the Hubble parameter. In the solution process, we did not rely on any additional conditions to solve the field equations; instead, the field equations were reduced to a time-dependent closed system of nonlinear first-order coupled differential equations for H and ρ. These differential equations were then converted to redshift space for numerical integration with the Runge–Kutta method. The study demonstrates that the deceleration parameter q changes sign from positive at high redshift to negative, passing through a transitional redshift z_t = [0.766, 0.769, 0.771] and z_t = [0.521, 0.770, 1.010] and reaching current values q₀ = [−0.61, −0.60, −0.59] and [−0.455, −0.595, −0.694] for different values of β₀ and α₀, respectively. Similarly, the effective equation of state w_eff shifted from the matter-dominated phase w_eff = 0 at high redshift to quintessence-like behavior at low redshift. Moreover, a super-accelerated, phantom-like regime with q₀ ≈ −1.59 and w_eff,0 ≈ −1.40 was obtained when α₀ = 0.55 and β₀ = 0.60 were employed. The model analysis reveals that the universe is presently in an accelerating expansion phase, propelled by a quintessence-type and phantom-like dark energy component, as corroborated by the Om(z) diagnostic test. The results are strongly consistent with the concordance ΛCDM model.
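
The paper's field equations are not reproduced here, but the time-to-redshift conversion it describes is generic: with dz/dt = −(1+z)H, any dH/dt closure becomes a dH/dz ODE that a Runge–Kutta integrator can march in redshift, after which q(z) = −1 + (1+z)H′(z)/H(z). A toy sketch with a single-fluid, constant-w closure (a stand-in, not the f(Q,T) model):

```python
import numpy as np
from scipy.integrate import solve_ivp

H0, w = 70.0, -0.7  # km/s/Mpc; constant effective EoS (toy values)

def dH_dz(z, H):
    # dH/dt = -(3/2)(1+w)H^2 rewritten in redshift via dz/dt = -(1+z)H
    return 1.5 * (1.0 + w) * H / (1.0 + z)

sol = solve_ivp(dH_dz, (0.0, 3.0), [H0], dense_output=True, rtol=1e-8)  # RK45
z = np.linspace(0.0, 3.0, 301)
H = sol.sol(z)[0]
Hprime = np.gradient(H, z)
q = -1.0 + (1.0 + z) * Hprime / H   # deceleration parameter; q < 0 => acceleration
```

For this constant-w toy model q(z) = (1 + 3w)/2 everywhere, which is a quick sanity check on the integration.
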
(This article belongs to the Special Issue Astrophysics and Cosmology at High Z)
18 pages, 2681 KB  
Article
Advancing Internal Dosimetry in Personalized Nuclear Medicine: Toward Optimized Radiopharmaceutical Use in Clinical Practice
by Ali H. D. Alshehri
Pharmaceuticals 2025, 18(11), 1741; https://doi.org/10.3390/ph18111741 - 17 Nov 2025
Viewed by 928
Abstract
Background: Quantifying absorbed doses from radiopharmaceuticals within human organs necessitates advanced computational modeling, as direct in vivo measurement remains impractical. Methods: In this study, three Monte Carlo-based simulation codes, Monte Carlo N-Particle version 6 (MCNP6), GEANT4 Application for Tomographic Emission (GATE), and GEANT4-based Architecture for Medicine-Oriented Simulations (GAMOS), were employed to evaluate internal dosimetry following the Medical Internal Radiation Dose (MIRD) formalism. As an illustrative case, simulations were first performed for 99mTc-MIBI uptake in the myocardium using an anthropomorphic phantom, with the heart modeled as the source organ to assess energy deposition in key target organs. Dose assessments were conducted at two time points: immediately post-injection and at 60 min post-injection (representing the cardiac rest phase), allowing comparison against established clinical reference data. Results: Across all codes, organ-specific dose distributions exhibited strong consistency. The pancreas absorbed the highest dose (GATE: 21%, GAMOS: 20%, MCNP6: 22%), followed by the gallbladder (GATE: 18%, GAMOS: 17%, MCNP6: 18%) and kidneys (GATE: 16%, GAMOS: 15%, MCNP6: 16%). These findings established a consistent organ dose ranking: pancreas > gallbladder > kidneys > spleen > heart/liver, corroborating previously published empirical data. To demonstrate the versatility of the framework, additional simulations were performed with 18F in an anthropomorphic phantom and with spherical tumor models using therapeutic radionuclides (177Lu and 225Ac). This broader application underscores the adaptability of the tri-code approach for both diagnostic and therapeutic scenarios. Conclusions: This comparative analysis highlights the complementary advantages of each Monte Carlo platform. GATE is well-suited for high-fidelity clinical applications where anatomical and physical accuracy are critical. GAMOS proves advantageous for rapid prototyping and iterative modeling workflows. MCNP6 remains a reliable benchmark tool, particularly effective in scenarios requiring robust radiation transport validation. Together, these Monte Carlo frameworks form a validated and adaptable toolkit for advancing internal dosimetry in personalized nuclear medicine, supporting both clinical decision-making and the development of safer, more effective radiopharmaceutical therapies.
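
The MIRD formalism implemented by all three codes reduces, per target organ, to a sum of time-integrated source activities weighted by S-values. A schematic sketch; the S-values and activities below are hypothetical placeholders, not data from the study:

```python
# MIRD: D(target) = sum over sources of A_tilde(source) * S(target <- source)
A_tilde = {"heart": 1.2e3, "liver": 0.4e3}        # MBq*s, hypothetical
S = {("pancreas", "heart"): 2.1e-6,               # mGy/(MBq*s), hypothetical
     ("pancreas", "liver"): 0.8e-6}

def absorbed_dose(target, A_tilde, S):
    """Absorbed dose to a target organ from all source organs."""
    return sum(a * S[(target, src)] for src, a in A_tilde.items())

print(absorbed_dose("pancreas", A_tilde, S))      # mGy
```
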
(This article belongs to the Section Radiopharmaceutical Sciences)
46 pages, 694 KB  
Review
The Two-Measure Theory and an Overview of Some of Its Manifestations
by Alexander B. Kaganovich
Universe 2025, 11(11), 376; https://doi.org/10.3390/universe11110376 - 13 Nov 2025
Viewed by 422
Abstract
The Two-Measure Theory (TMT) has been developing since 1998 and has yielded a number of highly interesting results, including those not realized in traditional field theory models. The most important advantage of TMT as an alternative theory is that, under the conditions under which all classical tests of general relativity are performed, TMT models are able to accurately reproduce Einstein’s general relativity. Despite this, TMT is still often perceived as something too exotic to be relevant to reality. In fact, the fundamental idea underlying TMT seems undeniable: if we truly believe in the effectiveness of mathematics in studying nature, we must agree that there must be a correspondence between the fundamental laws of nature and the structure of the mathematical apparatus necessary to adequately describe them. It then turns out that there is no reason to ignore the volume measure existing on the differentiable manifold on which the theory of gravity and matter fields is built. This idea has far-reaching implications. The goals of this paper are (1) to provide a clear mathematical and conceptual justification for TMT and (2) to collect in a single article some of the main results of TMT obtained over the past 25 years.
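
For readers unfamiliar with the construction, the core of TMT is a second, metric-independent volume measure built from auxiliary scalar fields alongside √(−g). Schematically, as commonly written in the TMT literature (conventions may differ from the paper's):

```latex
S = \int d^{4}x \,\bigl( \Phi\, L_{1} + \sqrt{-g}\, L_{2} \bigr),
\qquad
\Phi = \tfrac{1}{4!}\, \varepsilon^{\mu\nu\alpha\beta}\, \varepsilon_{abcd}\,
\partial_{\mu}\varphi^{a}\, \partial_{\nu}\varphi^{b}\,
\partial_{\alpha}\varphi^{c}\, \partial_{\beta}\varphi^{d}.
```

Here the four scalars φ^a build a scalar density Φ that enters on the same footing as √(−g); this is the "volume measure existing on the differentiable manifold" referenced in the abstract.
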
(This article belongs to the Special Issue Modified Gravity and Dark Energy Theories)
25 pages, 2655 KB  
Article
Characterization of Breast Microcalcifications Using Dual-Energy CBCT: Impact of Detector Configuration on Imaging Performance—A Simulation Study
by Evangelia Karali, Christos Michail, George Fountos, Nektarios Kalyvas and Ioannis Valais
Sensors 2025, 25(22), 6853; https://doi.org/10.3390/s25226853 - 9 Nov 2025
Viewed by 924
Abstract
Microcalcifications (HAp, CaCO₃, and CaC₂O₄) in breast tissue may indicate malignancy. Early-stage breast cancer diagnosis may benefit from the clinical application of dual-energy techniques. Dual-energy cone-beam computed tomography (CBCT) could strongly contribute to an accurate diagnosis, especially in dense breasts. This study focused on photon-counting detector alternatives to the standard cesium iodide (CsI) that CBCT currently relies on and investigated potential advantages over the employed CsI scintillators. Denser detector materials with a higher effective atomic number than CsI could improve image quality. A micro-CBCT was simulated in GATE using seven different detector configurations (CsI, bismuth germanate (BGO), lutetium oxyorthosilicate (LSO), lutetium–yttrium oxyorthosilicate (LYSO), gadolinium aluminum gallium garnet (GAGG), lanthanum bromide (LaBr₃), and cadmium zinc telluride (CZT)) and four breast tissue phantoms containing microcalcifications of both type I and type II. The dual-energy methodology was applied to planar and tomographic acquisition data. Tomographic data were reconstructed using filtered backprojection (FBP) and the ordered-subsets expectation-maximization (OSEM) algorithm. Image quality was measured using contrast-to-noise ratio (CNR) values. Both monoenergetic and polyenergetic models were considered. CZT and GAGG crystals presented higher CNR values than CsI. HAp microcalcifications exhibited the highest CNR values, which, when accompanied by OSEM, could be distinguished for classification. Detector configurations based on CZT or GAGG crystals could be adequate alternatives to CsI in dual-energy CBCT.
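
The dual-energy step itself is commonly a weighted log-subtraction of the low- and high-energy images, with the weight chosen to null the non-target tissue. A minimal sketch of that generic form, which may differ from the exact decomposition used in the paper:

```python
import numpy as np

def dual_energy_subtract(I_low, I_high, w):
    """Weighted log subtraction; tune w to null soft tissue and keep calcifications."""
    eps = 1e-9  # guard against log(0) in unattenuated or dead pixels
    return np.log(I_high + eps) - w * np.log(I_low + eps)
```
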
13 pages, 18580 KB  
Article
Optimization of Gamma Image Quality Through Experimental Evaluation Using 3D-Printed Phantoms Across Energy Window Levels
by Chanrok Park, Joowan Hong and Min-Gwan Lee
Bioengineering 2025, 12(11), 1211; https://doi.org/10.3390/bioengineering12111211 - 6 Nov 2025
Viewed by 664
Abstract
Energy window selection is a critical parameter for optimizing planar gamma image quality in nuclear medicine. In this study, we developed dedicated nuclear medicine phantoms using 3D printing technology to evaluate the impact of varying energy window levels on image quality. Three types of phantoms—a Derenzo phantom with six different sphere diameters, a modified Hoffman phantom incorporating lead for attenuation, and a quadrant bar phantom with four bar thicknesses constructed from bronze filament—were fabricated using Fusion 360 and an Ultimaker S5 3D printer with PLA and bronze-based materials. Planar images were acquired using 37 MBq of Tc-99m for 60 s at energy windows centered at 122, 140, and 159 keV. Quantitative assessments included contrast-to-noise ratio (CNR), coefficient of variation (COV), peak signal-to-noise ratio (PSNR), and structural similarity index measure (SSIM), comparing all images with the 140 keV image as the reference. The results showed a consistent decline in image quality at 122 keV and 159 keV, with the highest CNR, lowest COV, and optimal PSNR/SSIM values obtained at 140 keV. In visual analysis using the quadrant bar phantom, thinner bars were more clearly discernible at 140 keV than at other energy levels. These findings demonstrate that the application of an appropriate energy window—particularly 140 keV for Tc-99m—substantially improves image quality in planar gamma imaging. The use of customized, material-specific 3D-printed phantoms also enables flexible, reproducible evaluation protocols for energy-dependent imaging optimization and quality assurance in clinical nuclear medicine.
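
With the 140 keV acquisition as reference, the reference-based metrics are essentially one-liners; a minimal sketch using scikit-image (array names illustrative):

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def window_metrics(img, ref):
    """PSNR and SSIM of an image against the reference window, plus its COV."""
    rng = ref.max() - ref.min()
    psnr = peak_signal_noise_ratio(ref, img, data_range=rng)
    ssim = structural_similarity(ref, img, data_range=rng)
    cov = img.std() / img.mean()   # coefficient of variation over the region
    return psnr, ssim, cov
```
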
(This article belongs to the Section Biomedical Engineering and Biomaterials)
10 pages, 40138 KB  
Article
Scatter Removal in Photon-Counting Dual-Energy Chest X-Ray Imaging Using a Moving Block Method: A Simulation Phantom Study
by Bahaa Ghammraoui and Yee Lam Elim Thompson
Sensors 2025, 25(21), 6734; https://doi.org/10.3390/s25216734 - 3 Nov 2025
Viewed by 686
Abstract
This work investigates the impact of scatter correction on photon-counting dual-energy chest radiography using a moving block method, focusing on quantifying improvements with the IEC 62220-2-1 dual-energy metrics. A modified LucAl-based chest phantom with PMMA and aluminum inserts was modeled in three sizes (small, standard, large) to represent different patient sizes. Monte Carlo simulations with MC-GPU and the Photon Counting Toolkit were used to simulate a CdTe photon-counting detector with two energy thresholds at 30 and 70 keV. Scatter was estimated from blocker shadows at 25 positions, interpolated across the field of view, and smoothed with a Gaussian filter (σ = 5.0 mm), then subtracted separately from low- and high-energy images. Performance was evaluated using the per-feature dual-energy contrast (DEC) and the kerma-normalized dual-energy subtraction efficiency (DSE), with all acquisitions normalized to an entrance air kerma of 1 mGy to reflect clinical exposure conditions. In simulations, the moving block estimate reproduced the true scatter distribution with an average pixel-wise error of 0.4%. Scatter contamination introduced visible artifacts in the dual-energy subtraction images, particularly in aluminum-enhanced (Al-enhanced) images, and reduced contrast for target materials by up to 25%, as reflected in both DEC and DSE values at a fixed dose. Scatter correction restored image contrast, increased DEC for target materials while keeping non-target DEC low, and reduced edge artifacts across phantom sizes, with the largest gains in the large phantom. These results support the moving block method as a dose-neutral strategy to improve dual-energy subtraction performance in photon-counting chest radiography.
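
The correction pipeline described above (sparse scatter samples in blocker shadows, field-of-view interpolation, Gaussian smoothing, subtraction) maps directly onto SciPy primitives. A minimal sketch with illustrative geometry and variable names:

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import gaussian_filter

def moving_block_scatter(samples_xy, samples_val, shape, pixel_mm=1.0, sigma_mm=5.0):
    """Interpolate scatter measured in blocker shadows across the FOV, then smooth.

    samples_xy: (N, 2) array of (row, col) blocker-shadow positions;
    samples_val: scatter estimates at those positions.
    """
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    scatter = griddata(samples_xy, samples_val, (yy, xx), method="cubic")
    # Cubic interpolation leaves NaNs outside the convex hull; fill with nearest
    nn = griddata(samples_xy, samples_val, (yy, xx), method="nearest")
    scatter = np.where(np.isnan(scatter), nn, scatter)
    return gaussian_filter(scatter, sigma=sigma_mm / pixel_mm)

# corrected_low = low_img - moving_block_scatter(pts, vals, low_img.shape)
```
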
(This article belongs to the Special Issue Recent Advances in X-Ray Sensing and Imaging)
16 pages, 343 KB  
Article
Soliton Geometry of Modified Gravity Models Engaged with Strange Quark Matter Fluid and Penrose Singularity Theorem
by Mohd Danish Siddiqi and Fatemah Mofarreh
Symmetry 2025, 17(10), 1767; https://doi.org/10.3390/sym17101767 - 20 Oct 2025
Viewed by 595
Abstract
The nature of F(R,T)-gravity in conjunction with the quark matter fluid (QMF) is examined in this research note. In the F(R,T)-gravity framework, we derive the equation of state for the QMF for the decomposed form F(R,T) = F₁(R) + F₂(T) and for the F(R)-gravity model. We also discuss how the quark matter supports Ricci solitons with a conformal vector field in F(R,T)-gravity. Continuing this work, we give estimates for the pressure and quark density in the phantom barrier period and the radiation epoch, respectively. Additionally, we use Ricci solitons to identify several black hole prospects and energy requirements for quark matter fluid spacetime (QMF-spacetime) connected with F(R,T)-gravity. Furthermore, in the F(R,T)-gravity model connected with QMF, we discuss some applications of the Penrose singularity theorem in terms of Ricci solitons with a conformal vector field. Finally, we deduce the Schrödinger equation using the equation of state of the F(R,T)-gravity model connected with QMF, and we uncover some constraints that imply the existence of compact quark stars of the Ia-supernova type in the QMF-spacetime with F(R,T)-gravity.
(This article belongs to the Section Mathematics)
16 pages, 6154 KB  
Article
Design and Performance Assessment of a High-Resolution Small-Animal PET System
by Wei Liu, Peng Xi, Jiguo Liu, Xilong Xu, Zhaoheng Xie, Yanye Lu, Xiangxi Meng and Qiushi Ren
Bioengineering 2025, 12(10), 1119; https://doi.org/10.3390/bioengineering12101119 - 19 Oct 2025
Cited by 1 | Viewed by 670
Abstract
This work reports the performance evaluation of a newly developed small-animal positron emission tomography (PET) system based on lutetium–yttrium oxyorthosilicate (LYSO) crystals and multi-pixel photon counters (MPPCs). Evaluated performance included spatial resolution, system sensitivity, energy resolution, scatter fraction (SF), noise-equivalent count rate (NECR), micro-Derenzo phantom imaging, and in vivo imaging of mice and rats. The system achieved a tangential spatial resolution of 0.9 mm in the axial direction at a quarter axial offset using the three-dimensional ordered-subsets expectation maximization (3D OSEM) reconstruction algorithm. The peak sensitivity was 8.74% within a 200–750 keV energy window, with an average energy resolution of 12.5%. Scatter fractions were 12.9% and 30.0% for mouse- and rat-like phantoms, respectively. The NECR reached 878.7 kcps at 57.6 MBq for the mouse phantom and 421.4 kcps at 63.2 MBq for the rat phantom. High-resolution phantom and in vivo images confirmed the system’s capability for quantitative, high-sensitivity small-animal imaging, demonstrating its potential for preclinical molecular imaging studies.
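
The count-rate figures quoted follow the standard NEMA-style definitions; a minimal sketch:

```python
def scatter_fraction(scatter, trues):
    """SF = S / (T + S), from scattered and true coincidence rates."""
    return scatter / (trues + scatter)

def necr(trues, scatter, randoms):
    """Noise-equivalent count rate, NECR = T^2 / (T + S + R), all rates in cps."""
    return trues**2 / (trues + scatter + randoms)
```
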
(This article belongs to the Special Issue Application of Artificial Intelligence in Oncologic PET Imaging)
44 pages, 3213 KB  
Systematic Review
A Systematic Literature Review of Machine Learning Techniques for Observational Constraints in Cosmology
by Luis Rojas, Sebastián Espinoza, Esteban González, Carlos Maldonado and Fei Luo
Galaxies 2025, 13(5), 114; https://doi.org/10.3390/galaxies13050114 - 9 Oct 2025
Viewed by 2102
Abstract
This paper presents a systematic literature review focusing on the application of machine learning techniques for deriving observational constraints in cosmology. The goal is to evaluate and synthesize existing research to identify effective methodologies, highlight gaps, and propose future research directions. Our review identifies several key findings: (1) Various machine learning techniques, including Bayesian neural networks, Gaussian processes, and deep learning models, have been applied to cosmological data analysis, improving parameter estimation and handling large datasets. However, models achieving significant computational speedups often exhibit worse confidence regions compared to traditional methods, emphasizing the need for future research to enhance both efficiency and measurement precision. (2) Traditional cosmological methods, such as those using Type Ia Supernovae, baryon acoustic oscillations, and cosmic microwave background data, remain fundamental, but most studies focus narrowly on specific datasets. We recommend broader dataset usage to fully validate alternative cosmological models. (3) The reviewed studies mainly address the H₀ tension, leaving other cosmological challenges—such as the cosmological constant problem, warm dark matter, phantom dark energy, and others—unexplored. (4) Hybrid methodologies combining machine learning with Markov chain Monte Carlo offer promising results, particularly when machine learning techniques are used to solve differential equations, such as those in Einstein–Boltzmann solvers, prior to Markov chain Monte Carlo sampling, accelerating computations while maintaining precision. (5) There is a significant need for standardized evaluation criteria and methodologies, as variability in training processes and experimental setups complicates result comparability and reproducibility. (6) Our findings confirm that deep learning models outperform traditional machine learning methods for complex, high-dimensional datasets, underscoring the importance of clear guidelines to determine when the added complexity of learning models is warranted.
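
As a concrete instance of findings (1) and (4), Gaussian-process regression is often used in this literature to reconstruct H(z) from sparse expansion-rate data before or alongside MCMC analyses. A minimal sketch with made-up data points (not from any study in the review):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical H(z) measurements with uncertainties, in km/s/Mpc
z = np.array([0.07, 0.4, 0.9, 1.3, 1.75])[:, None]
H = np.array([69.0, 82.0, 105.0, 130.0, 155.0])
sigma = np.array([10.0, 8.0, 9.0, 11.0, 13.0])

kernel = ConstantKernel(100.0) * RBF(length_scale=1.0)
gp = GaussianProcessRegressor(kernel=kernel, alpha=sigma**2, normalize_y=True)
gp.fit(z, H)

z_grid = np.linspace(0.0, 2.0, 100)[:, None]
H_mean, H_std = gp.predict(z_grid, return_std=True)  # smooth H(z) with uncertainty
```
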
19 pages, 4096 KB  
Review
Review of VHEE Beam Energy Evolution for FLASH Radiation Therapy Under Ultra-High Dose Rate (UHDR) Dosimetry
by Nikolaos Gazis and Evangelos Gazis
Quantum Beam Sci. 2025, 9(4), 29; https://doi.org/10.3390/qubs9040029 - 9 Oct 2025
Viewed by 1805
Abstract
Very-high-energy electron (VHEE) beams, ranging from 50 to 400 MeV, are the subject of intense research interest for radiation therapy due to their accurate energy deposition in large and deep-seated tissues, sharp beam edges, and minimal effects on normal tissue. Together with ultra-high-energy electron (UHEE) beams of up to 1–2 GeV, they are considered highly effective for human tumor therapy while avoiding the spatial requirements and cost of proton and heavy-ion facilities. Many research laboratories in Europe, the USA, Japan, and other countries have developed advanced VHEE test infrastructures. These facilities aim to accelerate the transition to clinical application, following extensive beam-transport simulations that support preclinical trials and imminent clinical deployment. However, the clinical implementation of VHEE for FLASH radiation therapy requires advances in several areas, including the development of compact, stable, and efficient accelerators; the definition of sophisticated treatment plans; and the establishment of clinically validated protocols. In addition, the prospect of VHEE access to ultra-high dose rate (UHDR) dosimetry offers a promising route to the practical integration of FLASH radiotherapy for deep tumors, enhancing normal-tissue sparing while maintaining the inherent dosimetry advantages; a strong effort is nonetheless needed to improve the main accelerator operating conditions, ensuring a beam that is stable over time and across space, and to build compact infrastructure supporting the clinical implementation of VHEE for FLASH cancer treatment. This study explores technological progress and the evolution of electron accelerator beam energy, as simulated with the ASTRA code, for developing VHEE and UHEE beams aimed at medical applications. FLUKA simulations of the electron beam provide dose distribution plots and ranges for various energies inside a PMMA phantom.
(This article belongs to the Section Instrumentation and Facilities)
13 pages, 2731 KB  
Article
Suitability of Polyacrylamide-Based Dosimetric Gel for Proton and Carbon Ion Beam Geometric Characterization
by Riccardo Brambilla, Luca Trombetta, Gabriele Magugliani, Stefania Russo, Alessia Bazani, Eleonora Rossi, Eros Mossini, Elena Macerata, Francesco Galluccio, Mario Mariani and Mario Ciocca
Gels 2025, 11(10), 794; https://doi.org/10.3390/gels11100794 - 2 Oct 2025
Viewed by 539
Abstract
Experimental measurement of dose distributions is a pivotal step in the quality assurance of radiotherapy treatments, especially for those relying on high delivery accuracy such as hadron therapy. This study investigated the response of a polymer gel dosimeter to determine its suitability for geometric beam characterization in hadron therapy under high-quenching conditions. Different extraction energies of proton and carbon ion beams were considered. Gel dose–response linearity and long-term stability were confirmed through optical measurements. Gel phantoms were irradiated with pencil beams and analyzed via magnetic resonance imaging. A multi-echo T2-weighted sequence was used to reconstruct the depth–dose profiles and transversal distributions acquired by the gels, which were benchmarked against reference data. As expected, a response-quenching effect in the Bragg peak region was noted. Nonetheless, the studied gel formulation proved reliable in capturing the geometric characteristics of the beams, even without correcting for the quenching effect. Indeed, particle ranges measured from the gels’ depth–dose distributions showed excellent agreement with reference values, with mean discrepancies of 0.5 ± 0.2 mm. Single-spot transverse FWHM values at increasing depths also agreed within 1 mm on average with values determined with radiochromic films, supporting the excellent spatial resolving capabilities of the dosimetric gel.
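
The multi-echo T2-weighted readout described above is typically reduced to a per-voxel R2 (= 1/T2) estimate by fitting a monoexponential decay; a minimal sketch with illustrative echo times:

```python
import numpy as np

def fit_r2(signals, echo_times_ms):
    """Log-linear fit of S(TE) = S0 * exp(-TE * R2) for one voxel; R2 in 1/ms."""
    slope, _ = np.polyfit(echo_times_ms, np.log(np.maximum(signals, 1e-9)), 1)
    return -slope

te = np.array([10.0, 30.0, 50.0, 70.0, 90.0])   # ms, illustrative echo times
s = 1000.0 * np.exp(-te / 80.0)                 # synthetic voxel with T2 = 80 ms
print(1.0 / fit_r2(s, te))                      # recovers ~80 ms
```
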
(This article belongs to the Special Issue Application of Gel Dosimetry)
21 pages, 503 KB  
Article
Chaplygin and Polytropic Gases Teleparallel Robertson-Walker F(T) Gravity Solutions
by Alexandre Landry
Mathematics 2025, 13(19), 3143; https://doi.org/10.3390/math13193143 - 1 Oct 2025
Cited by 1 | Viewed by 726
Abstract
This paper investigates teleparallel Robertson–Walker (TRW) F(T) gravity solutions for a Chaplygin gas and then for any polytropic gas cosmological source. We use the TRW F(T) gravity field equations (FEs) for each k-parameter value case and the relevant gas equation of state (EoS) to find new teleparallel F(T) solutions. For the flat k = 0 cosmological case, we find analytical solutions valid for any cosmological scale factor. For the curved k = ±1 cosmological cases, we find new approximate teleparallel F(T) solutions for slow, linear, fast, and very fast universe expansion, summarized by a double power-law function. The new solutions will be relevant for future cosmological applications to dark matter, dark energy (DE) quintessence, phantom energy, anti-de Sitter (AdS) spacetimes, and several other cosmological processes.
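
For reference, the equations of state named above take standard forms, and for the pure Chaplygin case the continuity equation fixes the density evolution (the paper's generalized variants may differ):

```latex
P = -\frac{A}{\rho} \ \text{(Chaplygin)}, \qquad
P = K \rho^{\,1 + 1/n} \ \text{(polytropic)};
\qquad
\dot{\rho} + 3H(\rho + P) = 0
\;\Longrightarrow\;
\rho_{\mathrm{Chap}}(a) = \sqrt{A + \frac{B}{a^{6}}}.
```

The Chaplygin density interpolates between dust-like behavior (ρ ∝ a⁻³) at early times and a cosmological-constant-like phase at late times, which is what makes such gases candidate dark energy sources in these solutions.
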
(This article belongs to the Section E4: Mathematical Physics)