Search Results (248)

Search Parameters:
Keywords = standard consortium

18 pages, 852 KB  
Article
Design and Interim Recruitment Outcomes of a Multi-Modal, Multi-Level Patient Navigation Intervention for Lung Cancer Screening in the Southeast U.S.
by Marvella E. Ford, Louise Henderson, Alison Brenner, Vanessa B. Sheppard, Stephanie B. Wheeler, Tiffani Collins, Monique Williams, Rosuany Vélez Acevedo, Christopher Lyu, Chyanne Summers, Courtenay Scott, Aretha R. Polite-Powers, Sharvette J. Slaughter, Dana LaForte, Darin King, Amber S. McCoy, Jessica Zserai, Sherrick S. Hill, Melanie Slan, Steve Bradley-Bull, Neusolia Valmond, Angela M. Malek, Ellen Gomez, Megan R. Ellison and Robert A. Winn
Cancers 2025, 17(22), 3633; https://doi.org/10.3390/cancers17223633 - 12 Nov 2025
Abstract
Background/Objectives: Lung cancer is the leading cause of cancer death in the United States (U.S.). Virginia, South Carolina, and North Carolina are among the U.S. states with extraordinarily high rates of lung cancer mortality, particularly among Black residents. The current lung cancer screening guidelines, revised in 2021, support screening for younger, non-Medicare age-eligible individuals who smoke. However, their health insurance, if any, may not cover their screening. This lack of access could create more disparities in lung cancer mortality rates. Methods: To address this concern, the Virginia Commonwealth University Massey Comprehensive Cancer Center, the Medical University of South Carolina Hollings Cancer Center, and the University of North Carolina Chapel Hill Lineberger Comprehensive Cancer Center secured a four-year Stand Up To Cancer® (SU2C) grant titled “Southeastern Consortium for Lung Cancer Screening (SC3) Study” with a novel aim to test the effectiveness of a multimodal, multilevel, barrier-focused patient navigation intervention to promote lung cancer screening among Black patients from federally qualified health centers. Results: A total of 170/675 Black participants have been recruited to date. The majority of participants (n = 134; 78.82%) were aged 55–74 years. Most participants were unmarried (n = 100; 58.82%), more than half had a high school education/GED or less (n = 111; 65.29%), most currently smoked (n = 142; 83.53%), and more males than females participated (n = 107; 62.94% male). Their reported lung cancer screening barriers, addressed by the patient navigators, were cost concerns, insurance coverage issues, and recent medical history precluding screening. Conclusions with Relevance to Cancer Health Equity: This SC3 study includes a unique lung cancer screening cohort that is in direct contrast to the predominantly White cohort in the National Lung Screening Trial. 
The SU2C study has created a novel, community-engaged approach to lung cancer screening navigation that could become the gold standard in high-risk medically underserved populations.
(This article belongs to the Special Issue Health Services Research in Cancer Care)

17 pages, 4150 KB  
Article
An International Inter-Consortium Validation of Knowledge-Based Plan Prediction Modeling for Whole Breast Radiotherapy Treatment
by Lorenzo Placidi, Peter Griffin, Roberta Castriconi, Alessia Tudda, Giovanna Benecchi, Mark Burns, Elisabetta Cagni, Cathy Markham, Valeria Landoni, Eugenia Moretti, Caterina Oliviero, Giulia Rambaldi Guidasci, Guenda Meffe, Tiziana Rancati, Alessandro Scaggion, Karen McGoldrick, Vanessa Panettieri and Claudio Fiorino
Cancers 2025, 17(21), 3576; https://doi.org/10.3390/cancers17213576 - 5 Nov 2025
Abstract
Background: Knowledge-based (KB) planning is a promising approach to model prior planning experience and optimize radiotherapy. To enable the sharing of models across institutions, their transferability must be evaluated. This study aimed to validate KB prediction models developed by a national consortium using data from another multi-institutional consortium in a different country. Methods: Ten right whole breast tangential field (RWB-TF) models were built within the national consortium. A cohort of 20 patients from the external consortium was used for testing. Transferability was defined as the ipsilateral (IPSI) lung first principal component (PC1) falling within the 10th–90th percentile range of the training set. Predicted dose–volume parameters were compared with clinical dose–volume histograms (cDVHs). Results: Planning target volume (PTV) coverage strategies were comparable between the two consortia, even though significant volume differences were observed for the PTV and contralateral breast (p = 0.002 and p = 0.02, respectively). For the IPSI lung, the standard deviation of predicted mean dose/V20 Gy was 1.13 Gy/2.9% in the external consortium versus 0.55 Gy/1.6% in the training consortium. Differences between the cDVH and the predicted IPSI lung mean dose and the volume receiving more than 20 Gy (V20 Gy) were <2 Gy and <5% in 88.7% and 92.3% of cases, respectively. PC1 values fell within the 10th–90th percentile for ≥90% of patients in 6/10 models and for 65–85% of patients in the remaining four. Conclusions: This study demonstrates the feasibility of applying RWB-TF KB models beyond the consortium in which they were developed, supporting broader clinical implementation. This retrospective study was supported by AIRC (Associazione Italiana per la Ricerca sul Cancro) and registered on ClinicalTrials.gov (NCT06317948, 12 March 2024).
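The transferability criterion described in this abstract (a patient's IPSI lung PC1 score falling inside the training set's 10th–90th percentile window) can be sketched in a few lines of Python. This is a minimal illustration with invented PC1 values, not the consortium's actual pipeline:

```python
from statistics import quantiles

def percentile_window(train_pc1):
    """10th and 90th percentiles of the training-set PC1 scores."""
    cuts = quantiles(train_pc1, n=10)  # 9 cut points: 10th, 20th, ..., 90th
    return cuts[0], cuts[-1]

def is_transferable(train_pc1, patient_pc1):
    """A model is considered applicable when the patient's PC1 lies in the window."""
    lo, hi = percentile_window(train_pc1)
    return lo <= patient_pc1 <= hi

# Invented PC1 scores for a training cohort and two test patients
train = [0.2, 0.5, 0.9, 1.1, 1.4, 1.8, 2.0, 2.3, 2.7, 3.1]
print(is_transferable(train, 1.5))  # inside the 10th-90th window
print(is_transferable(train, 5.0))  # outside: model not validated for this case
```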

28 pages, 3663 KB  
Article
Understanding EV Charging Pain Points Through Deep Learning Analysis
by Jason Clifford, Mayuresh Savargaonkar, Paden Rumsey, Benny Varghese, John Smart and Casey Quinn
World Electr. Veh. J. 2025, 16(11), 606; https://doi.org/10.3390/wevj16110606 - 4 Nov 2025
Abstract
Current and potential electric vehicle (EV) owners express concerns about the charging infrastructure, mentioning non-functional chargers, prolonged charging times, inconvenient charger locations, long wait times, and high costs as major barriers. Addressing these issues often requires analyzing actual vehicle charging data, which is typically proprietary and inconsistent due to diverse standards and protocols. To understand and improve the EV charging experience, customer reviews are commonly used to identify common customer pain points (CPPs). However, there is no comprehensive method to map customer reviews to a standardized set of CPPs. In collaboration with the National Charging Experience (ChargeX) Consortium, this study bridges these gaps by proposing a Systematic Categorization and Analysis of Large-scale EV-charging Reviews (SCALER) framework. SCALER is an integrated, deep learning framework that segments, actively labels, analyzes, and classifies EV charging customer reviews into six CPP categories. To test its effectiveness, we used SCALER to analyze over 72,000 reviews from customers charging various EV models on different networks across the United States. SCALER achieves a classification accuracy of 92.5%, with an F1 score exceeding 85.7%. By demonstrating real-world applications of SCALER, we enhance the industry’s ability to understand and address CPPs to improve the EV charging experience.
(This article belongs to the Section Charging Infrastructure and Grid Integration)
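As a rough sketch of how the reported classification metrics are computed over multiple CPP categories: accuracy is the fraction of correctly labeled reviews, and macro F1 averages per-class F1. The category names and labels below are invented for illustration; SCALER itself is a deep learning pipeline not shown here:

```python
def accuracy(y_true, y_pred):
    """Fraction of reviews whose predicted CPP category matches the true one."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred, labels):
    """Unweighted mean of per-class F1, a usual multi-class summary."""
    scores = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)

# Invented review labels (hypothetical CPP category names)
truth = ["cost", "cost", "wait_time", "slow_charging"]
preds = ["cost", "wait_time", "wait_time", "slow_charging"]
print(accuracy(truth, preds))  # 0.75
print(macro_f1(truth, preds, ["cost", "wait_time", "slow_charging"]))
```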

18 pages, 291 KB  
Review
Novel Treatment Concepts for Cervical Cancer—Moving Towards Personalized Therapy
by Melina Danisch, Magdalena Postl, Thomas Bartl, Christoph Grimm, Alina Sturdza, Nicole Concin and Stephan Polterauer
J. Pers. Med. 2025, 15(11), 523; https://doi.org/10.3390/jpm15110523 - 1 Nov 2025
Abstract
In recent years, several randomized controlled trials have been published regarding cervical cancer therapy and significantly changed the treatment landscape. Recent advances have improved the treatment options and allow personalized treatment concepts with escalation of treatment in high-risk disease and de-escalation with reduction in morbidity in selected low-risk patients. This review aims to provide a comprehensive analysis of the latest landmark studies that are poised to significantly influence clinical practice. Personalized treatment concepts with careful patient selection allow de-escalation in the surgical treatment of cervical cancer. In low-risk cervical cancer patients (lesions of ≤2 cm with limited stromal invasion), simple hysterectomy (SH) was non-inferior to radical hysterectomy in terms of 3-year incidence of pelvic recurrence and was associated with a lower risk of urinary incontinence or retention and improved sexual health and quality of life. Furthermore, sentinel lymphadenectomy is steadily replacing systematic pelvic lymphadenectomy in patients with low-risk cervical cancer. In addition, further studies are necessary to clarify the role of postoperative therapy for patients with intermediate-risk cervical cancer. Starting in 2008, the EMBRACE studies have assessed the role of image-guided adaptive brachytherapy (IGABT) in locally advanced cervical cancer (LACC) in addition to modern external beam radiotherapy concurrent with chemotherapy. The publication of the results of the EMBRACE I prospective study established MRI-guided IGABT as state-of-the-art brachytherapy for LACC. EMBRACE II and additional prospective studies emerging from this consortium will address important questions in modern radiotherapy for LACC. Immune checkpoint inhibitors (CPIs) have been evaluated across various clinical settings and are expected to be utilized in numerous scenarios due to several positive randomized trials. Particularly, the combination of platinum-based chemotherapy and pembrolizumab, with or without bevacizumab, has been established as the new standard treatment for primary metastatic or recurrent PD-L1 positive high-risk cervical cancer. In locally advanced cervical cancer, two new treatment escalation regimens—neoadjuvant chemotherapy and adjuvant CPI therapy—have been evaluated in addition to chemoradiation. Furthermore, antibody-drug conjugates, such as tisotumab-vedotin, represent a promising future therapeutic option for recurrent cervical cancer.
23 pages, 1313 KB  
Article
Data Component Method Based on Dual-Factor Ownership Identification with Multimodal Feature Fusion
by Shenghao Nie, Jin Shi, Xiaoyang Zhou and Mingxin Lu
Sensors 2025, 25(21), 6632; https://doi.org/10.3390/s25216632 - 29 Oct 2025
Abstract
In the booming digital economy, data circulation—particularly for massive multimodal data generated by IoT sensor networks—faces critical challenges: ambiguous ownership and broken cross-domain traceability. Traditional property rights theory, ill-suited to data’s non-rivalrous nature, leads to ownership fuzziness after multi-source fusion and traceability gaps in cross-organizational flows, hindering marketization. This study aims to establish native ownership confirmation capabilities in trusted IoT-driven data ecosystems. The approach involves a dual-factor system: the collaborative extraction of text (from sensor-generated inspection reports), numerical (from industrial sensor measurements), visual (from 3D scanning sensors), and spatio-temporal features (from GPS and IoT device logs) generates unique SHA-256 fingerprints (first factor), while RSA/ECDSA private key signatures (linked to sensor node identities) bind ownership (second factor). An intermediate state integrates these with metadata, supported by blockchain (consortium chain + IPFS) and cross-domain protocols optimized for IoT environments to ensure full-link traceability. This scheme, tailored to the characteristics of IoT sensor networks, breaks traditional ownership confirmation bottlenecks in multi-source fusion, demonstrating strong performance in ownership recognition, anti-tampering robustness, cross-domain traceability, and encryption performance. It offers technical and theoretical support for standardized data components and the marketization of data elements within IoT ecosystems.
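The dual-factor idea described in this abstract (a SHA-256 fingerprint over fused multimodal features, then a key-based ownership binding) can be illustrated with the standard library. Note that HMAC stands in below for the paper's RSA/ECDSA private-key signatures, and all field names and values are hypothetical:

```python
import hashlib
import hmac
import json

def fingerprint(features: dict) -> str:
    """First factor: SHA-256 over a canonical serialization of the fused
    text/numerical/visual/spatio-temporal features."""
    canonical = json.dumps(features, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def bind_ownership(fp: str, owner_key: bytes) -> str:
    """Second factor: the owner's key binds the fingerprint.
    HMAC is used here as a stand-in for an RSA/ECDSA signature."""
    return hmac.new(owner_key, fp.encode(), hashlib.sha256).hexdigest()

# Hypothetical fused feature record from one sensor node
record = {
    "text": "inspection report: surface ok",
    "numerical": [20.5, 101.3],
    "visual": "3d-scan-digest-abc123",
    "spatio_temporal": {"gps": [32.06, 118.79], "ts": 1718000000},
}
key = b"sensor-node-private-key"  # placeholder for a real private key
tag = bind_ownership(fingerprint(record), key)

# Verification recomputes both factors; any tampering changes the fingerprint
assert hmac.compare_digest(bind_ownership(fingerprint(record), key), tag)
```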

34 pages, 5206 KB  
Article
Enhancing Transparency and Trust in Higher Education Institutions via Blockchain: A Conceptual Model Utilizing the Ethereum Consortium Approach
by Yerlan Kistaubayev, Francisco Liébana-Cabanillas, Aijaz A. Shaikh, Galimkair Mutanov, Olga Ussatova and Ainura Shinbayeva
Sustainability 2025, 17(20), 9350; https://doi.org/10.3390/su17209350 - 21 Oct 2025
Abstract
Blockchain technology has been recognized as contributing to the environmental Sustainable Development Goals (SDGs) and has emerged as a disruptive innovation capable of significantly transforming various economic and social sectors. This conceptual paper is driven by the need to explore how blockchain, specifically a consortium-based Ethereum architecture, can be integrated into higher education institutions to ensure data sovereignty, integrity, and verifiability while adhering to legal and ethical standards such as GDPR. We propose a multi-layered blockchain-based model for Kazakhstan’s Unified Platform of Higher Education (UPHE). This model employs hybrid on-chain/off-chain data storage, smart contract automation, and a Proof-of-Authority consensus mechanism to address system limitations, including data centralization and inadequate verification of academic credentials. Empirical simulations using Blockscout and Ethereum-compatible tools demonstrate the model’s feasibility and performance. This paper contributes to the growing discussion on educational blockchain applications by presenting a scalable, secure, and transparent architecture that aligns with institutional governance and Environmental, Social, and Governance (ESG) principles. It also supports the objectives of UN SDG 4 (i.e., Quality education) by fostering trust, transparency, and equitable access to verifiable educational credentials.
(This article belongs to the Special Issue Emerging Technologies Implementation in Sustainable Management)
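The hybrid on-chain/off-chain storage pattern this model relies on can be sketched without any blockchain stack: the full credential is kept off-chain and only its digest is anchored on-chain, so tampering is detectable. A plain dict stands in for the Ethereum consortium chain here, and the credential fields are invented:

```python
import hashlib
import json

class CredentialLedger:
    """Minimal sketch of hybrid storage: full credentials off-chain,
    only SHA-256 digests anchored 'on-chain' (a dict stands in for the chain)."""

    def __init__(self):
        self.off_chain = {}  # credential id -> full document
        self.on_chain = {}   # credential id -> digest anchored by the chain

    def issue(self, cred_id: str, document: dict) -> None:
        blob = json.dumps(document, sort_keys=True).encode()
        self.off_chain[cred_id] = document
        self.on_chain[cred_id] = hashlib.sha256(blob).hexdigest()

    def verify(self, cred_id: str, document: dict) -> bool:
        """Recompute the digest and compare against the anchored one."""
        blob = json.dumps(document, sort_keys=True).encode()
        return self.on_chain.get(cred_id) == hashlib.sha256(blob).hexdigest()

ledger = CredentialLedger()
diploma = {"student": "A. Student", "degree": "BSc", "year": 2025}
ledger.issue("cred-001", diploma)
print(ledger.verify("cred-001", diploma))                           # True
print(ledger.verify("cred-001", {**diploma, "degree": "MSc"}))      # False
```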

15 pages, 3174 KB  
Communication
3D Data Practices and Preservation for Humanities: A Decade of the Consortium “3D for Digital Humanities”
by Mehdi Chayani, Xavier Granier and Florent Laroche
Heritage 2025, 8(10), 435; https://doi.org/10.3390/heritage8100435 - 16 Oct 2025
Abstract
For more than a decade (2014–2025), the Consortium “3D for Digital Humanities” has been advancing the use of 3D technologies in the Humanities and Social Sciences (HSS) while structuring and supporting the research community. It now brings together more than 30 teams, primarily from academic research, but also increasingly from the cultural sector. Under its coordination, significant achievements have been realized, including best-practice guides, an infrastructure for the publication of 3D data, and dedicated software for documentation, dissemination, and archiving, as well as a metadata schema, all fully aligned with FAIR principles. The Consortium has developed national training programs, particularly on metadata and ethical practices, and contributed to important initiatives such as the reconstruction of Notre-Dame de Paris, while actively engaging in European projects. It has also fostered international collaborations to broaden perspectives, share methodologies, and amplify impacts. Looking ahead (2025–2033), the Consortium aims to address the environmental impact of 3D data production and storage by proposing best practices for digital sustainability and efficiency. It is also expanding the National 3D Data Repository, enhancing interoperability, and adopting emerging standards to meet evolving scientific needs. Building on its past achievements, the Consortium intends to further advance 3D research and its applications across disciplines, positioning 3D data as a key component of future scientific data clouds.

19 pages, 1561 KB  
Article
Integrating Genomics and Deep Phenotyping for Diagnosing Rare Pediatric Neurological Diseases: Potential for Sustainable Healthcare in Resource-Limited Settings
by Nigara Yerkhojayeva, Nazira Zharkinbekova, Sovet Azhayev, Ainash Oshibayeva, Gulnaz Nuskabayeva and Rauan Kaiyrzhanov
Int. J. Transl. Med. 2025, 5(4), 47; https://doi.org/10.3390/ijtm5040047 - 4 Oct 2025
Abstract
Background: Rare pediatric neurological diseases (RPND) often remain undiagnosed for years, creating prolonged and costly diagnostic odysseys. Combining Human Phenotype Ontology (HPO)-based deep phenotyping with exome sequencing (ES) and reverse phenotyping offers the potential to improve diagnostic yield, accelerate diagnosis, and support sustainable healthcare in resource-limited settings. Objectives: To evaluate the diagnostic yield and clinical impact of an integrated approach combining deep phenotyping, ES, and reverse phenotyping in children with suspected RPNDs. Methods: In this multicenter observational study, eighty-one children from eleven hospitals in South Kazakhstan were recruited via the Central Asian and Transcaucasian Rare Pediatric Neurological Diseases Consortium. All patients underwent standardized HPO-based phenotyping and ES, with variant interpretation following ACMG guidelines. Reverse phenotyping and interdisciplinary discussions were used to refine clinical interpretation. Results: A molecular diagnosis was established in 34 of 81 patients (42%) based on pathogenic or likely pathogenic variants. Variants of uncertain significance (VUS) were identified in an additional 9 patients (11%), but were reported separately and not included in the diagnostic yield. Reverse phenotyping clarified or expanded clinical features in one-third of genetically diagnosed cases and provided supportive evidence in most VUS cases, although their classification remained unchanged. Conclusions: Integrating deep phenotyping, ES, and reverse phenotyping substantially improved diagnostic outcomes and shortened the diagnostic odyssey. This model reduces unnecessary procedures, minimizes delays, and provides a scalable framework for advancing equitable access to genomic diagnostics in resource-constrained healthcare systems.

25 pages, 1446 KB  
Review
Lactiplantibacillus plantarum as a Psychobiotic Strategy Targeting Parkinson’s Disease: A Review and Mechanistic Insights
by Wu-Lin Chen, Fu-Sheng Deng and Ying-Chieh Tsai
Nutrients 2025, 17(19), 3047; https://doi.org/10.3390/nu17193047 - 24 Sep 2025
Abstract
Parkinson’s disease (PD) is a progressive neurodegenerative disorder characterized by the pathological aggregation of α-synuclein (α-syn), the loss of dopaminergic neurons, and the appearance of both motor and non-motor symptoms. Emerging evidence suggests a bidirectional influence of the microbiota–gut–brain axis in PD pathogenesis, where gut dysbiosis contributes to increased intestinal barrier permeability, immune activation, chronic inflammation, oxidative stress, α-syn misfolding, and neurotransmitter imbalance. These findings have increased interest in probiotics as microbiota-targeted interventions that restore intestinal and systemic homeostasis. Lactiplantibacillus plantarum, a probiotic species with remarkable environmental adaptability and genomic plasticity, has emerged as a promising candidate for PD management. Preclinical studies demonstrate that specific Lpb. plantarum strains, such as PS128 or CCFM405, can beneficially modulate gut microbial communities, reinforce barrier integrity, regulate bile acid metabolism, attenuate neuroinflammatory responses, and improve motor deficits in PD-like mice. In addition, Lpb. plantarum DP189 or SG5 interventions can significantly reduce α-syn aggregation in the brain via suppression of oxidative stress, modulation of neuroinflammatory responses, and activation of neurotrophic factors. Recent evidence even suggests that Lpb. plantarum-derived extracellular vesicles may possess anti-PD activity by influencing host gene expression, neuronal function, and immune modulation. Although robust clinical data are still limited, preliminary clinical trials indicate that supplementation with PS128 or certain Lpb. plantarum-containing consortia can alleviate constipation, improve gastrointestinal function, reduce systemic inflammation, and even ameliorate motor symptoms when used alongside standard dopaminergic therapies. In this review, we provide an integrated overview of preclinical, clinical, and mechanistic insights, and evaluate the translational potential of Lpb. plantarum as a safe and diet-based strategy to target the microbiota–gut–brain axis in PD.
(This article belongs to the Special Issue Probiotics and Prebiotics: Past, Present and Future)

9 pages, 952 KB  
Data Descriptor
A Framework for the Datasets of CRDS CO2 and CH4 Stable Carbon Isotope Measurements in the Atmosphere
by Francesco D’Amico, Ivano Ammoscato, Giorgia De Benedetto, Luana Malacaria, Salvatore Sinopoli, Teresa Lo Feudo, Daniel Gullì and Claudia Roberta Calidonna
Data 2025, 10(9), 150; https://doi.org/10.3390/data10090150 - 22 Sep 2025
Abstract
Accessible datasets of greenhouse gas (GHG) concentrations help define long-term trends on a global scale and also provide significant information on the characteristic variability of emission sources and sinks. The integration of stable carbon isotope measurements of carbon dioxide (CO2) and methane (CH4) can significantly increase the accuracy and reliability of source apportionment efforts, due to the isotopic fractionation processes and fingerprint that characterize each mechanism. Via isotopic parameters such as δ13C, the ratio of 13C to 12C compared to an international standard (VPDB, Vienna Pee Dee Belemnite), it is in fact possible to discriminate, for example, between thermogenic and microbial sources of CH4, thus ensuring a more detailed understanding of global balances. A number of stations within the Italian consortium of atmospheric observation sites have been equipped with Picarro G2201-i CRDS (Cavity Ring-Down Spectrometry) analyzers capable of measuring the stable carbon isotopic ratios of CO2 and CH4, reported as δ13C-CO2 and δ13C-CH4, respectively. The first dataset (Lamezia Terme, Calabria region) of the consortium resulting from these measurements was released, and a second dataset (Potenza, Basilicata region) from another station was also released, relying on the same format to effectively standardize these new types of datasets. This work provides details on the data, format, and methods used to generate these products and describes a framework for the format and processing of similar data products based on CRD spectroscopy.
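The δ13C notation used in this abstract follows the standard delta definition, δ13C = (Rsample/RVPDB − 1) × 1000, expressed in per mil (‰). A minimal sketch, assuming the commonly cited value for the VPDB 13C/12C ratio:

```python
R_VPDB = 0.0112372  # commonly cited 13C/12C ratio of the VPDB standard

def delta13c(r_sample: float) -> float:
    """delta-13C in per mil relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A sample isotopically identical to the standard gives 0; a 13C-depleted
# sample (e.g. microbial CH4 is strongly depleted) gives a negative value.
print(delta13c(R_VPDB))   # 0.0
print(delta13c(0.0105))   # negative: depleted relative to VPDB
```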

20 pages, 9451 KB  
Article
Aeration Rate in Tertiary Treatment of Anaerobic Effluent from Soft Drink Industry by Co-Cultivation Between Penicillium gravinicasei and Microalgae
by João Victor Oliveira Nascimento da Silva, Carlos Eduardo de Farias Silva, Jânio Nunes Sampaio, Bruno Roberto dos Santos, Tácia Souza da Silva, Brígida Maria Villar da Gama, Anderson Correia da Silva, Albanise Enide da Silva and Renata Maria Rosas Garcia Almeida
Fermentation 2025, 11(9), 539; https://doi.org/10.3390/fermentation11090539 - 17 Sep 2025
Abstract
The soft drink industry generates effluents with high organic loads and contaminants such as nitrogen and phosphorus, requiring sequential secondary and tertiary treatments to meet international discharge standards. Moving beyond traditional monocultures, this study developed a microbial consortium (forming microalga–fungus pellets), demonstrating a synergistic combination due to the resistance of the pellets, enhancing the treatment efficiency, and facilitating the recovery of the microbial sludge produced. Specifically, the treatment of anaerobic effluents (tertiary treatment) from the soft drink industry using consortia of the fungus Penicillium gravinicasei and the microalgae Tetradesmus obliquus and Chlorella sp. in aerated reactors was evaluated, analyzing the impact of aeration rates (0.5–3.5 vvm) on pollutant removal and microbial sludge production. The results showed that moderate aeration rates (1.5 vvm) optimized the removal of COD (up to 92.5%), total nitrogen (TN) (up to 79.3%), and total phosphorus (TP) (up to 83.4%) in just 2.5 h. Furthermore, excessive aeration reduced treatment efficiency due to microbial stress and difficulty in forming microalga–fungus pellets. The Chlorella sp. consortium showed greater stability, while T. obliquus was more sensitive to the aeration rate. Microbial sludge production was also optimized at around 1.5 vvm, as a consequence of the pollutant removal, with the formation of pellets that facilitated biomass harvesting.
(This article belongs to the Special Issue Cyanobacteria and Eukaryotic Microalgae (2nd Edition))

15 pages, 3856 KB  
Article
Artificial Intelligence-Based Arterial Input Function for the Quantitative Assessment of Myocardial Blood Flow and Perfusion Reserve in Cardiac Magnetic Resonance: A Validation Study
by Lara R. van der Meulen, Maud van Dinther, Amedeo Chiribiri, Jouke Smink, CRUCIAL Investigators, Walter H. Backes, Jonathan Bennett, Joachim E. Wildberger, Cian M. Scannell and Robert J. Holtackers
Diagnostics 2025, 15(18), 2341; https://doi.org/10.3390/diagnostics15182341 - 16 Sep 2025
Abstract
Background/Objectives: To validate an artificial intelligence-based arterial input function (AI-AIF) deep learning model for myocardial blood flow (MBF) quantification during stress perfusion and assess its extension to rest perfusion, enabling myocardial perfusion reserve (MPR) calculation. Methods: Sixty patients with or at risk for vascular cognitive impairment, prospectively enrolled in the CRUCIAL consortium, underwent quantitative stress and rest myocardial perfusion imaging using a 3 T MRI system. Perfusion imaging was performed using a dual-sequence (DS) protocol after intravenous administration of 0.05 mmol/kg gadobutrol. Retrospectively, the AI-AIF was estimated from standard perfusion images using a 1-D U-Net model trained to predict an unsaturated AIF from a saturated input. MBF was quantified using Fermi function-constrained deconvolution with motion compensation. MPR was calculated as the stress-to-rest MBF ratio. MBF and MPR estimates from both AIF methods were compared using Bland–Altman analyses. Results: Complete stress and rest perfusion datasets were available for 31 patients. A bias of −0.07 mL/g/min was observed between AI-AIF and DS-AIF for stress MBF (median 2.19 vs. 2.30 mL/g/min), with concordant coronary artery disease classification based on the optimal MBF threshold in over 92% of myocardial segments and coronary arteries. Larger biases of 0.12 mL/g/min and −0.30 were observed for rest MBF (1.12 vs. 1.02 mL/g/min) and MPR (2.31 vs. 1.84), respectively, with lower concordance using the optimal MPR threshold (85% of segments, 72% of arteries). Conclusions: The AI-AIF model showed comparable performance to DS-AIF for stress MBF quantification but requires further training for accurate rest MBF and MPR assessment.
(This article belongs to the Section Machine Learning and Artificial Intelligence in Diagnostics)

33 pages, 1483 KB  
Article
From Model to Mechanism: Enforcing Delegated Authority in SSI with Language-Based Security
by Muhamed Turkanović, Vid Keršič, Alen Horvat, Dominik Beron and Špela Čučko
Mathematics 2025, 13(18), 2971; https://doi.org/10.3390/math13182971 - 14 Sep 2025
Viewed by 1079
Abstract
Delegation of authority remains a critical yet insufficiently addressed capability in Self-Sovereign Identity (SSI) systems. Building on an existing delegation model that introduced the concept of a Verifiable Mandate (VM) for expressing authority and access rights, this paper extends the approach with a rigorous formalization of delegation semantics, enabling unambiguous reasoning over roles, grants, and constraints. The formal model is aligned with standards from the World Wide Web Consortium (W3C), and its constructs are embedded into an extended credential schema that preserves compatibility with the Verifiable Credentials (VC) data model while introducing delegation-specific attributes. A generalized VM schema is defined, supporting both generic and business-specific instantiations, and ensuring structural and semantic interoperability. Policy compliance is operationalized through a policy-based enforcement architecture, where rules are authored in the Rego language and evaluated at runtime by the Open Policy Agent (OPA). The architecture incorporates trusted registries for schema and policy distribution, allowing verifiers to define and enforce context-specific delegation rules in a modular and interoperable manner. Validation through realistic scenarios, such as postal service and academic use cases, demonstrates how formal semantics, schema validation, and language-based policy enforcement can be combined to enable secure, verifiable, and context-aware delegation in SSI ecosystems. Full article
(This article belongs to the Special Issue Applied Cryptography and Blockchain Security)

27 pages, 1553 KB  
Review
The Gut Microbiome and Epigenomic Reprogramming: Mechanisms, Interactions, and Implications for Human Health and Disease
by Noelle C. Rubas, Amada Torres and Alika K. Maunakea
Int. J. Mol. Sci. 2025, 26(17), 8658; https://doi.org/10.3390/ijms26178658 - 5 Sep 2025
Cited by 1 | Viewed by 4133
Abstract
The human gut microbiome is a metabolically active and ecologically dynamic consortium that profoundly influences host physiology, in part by modulating epigenetic mechanisms such as DNA and RNA methylation. These modifications regulate gene expression and phenotypic plasticity and are shaped by a combination of environmental factors, such as diet, stress, xenobiotics, and bioactive microbial metabolites. Despite growing evidence linking microbial signals to host epigenetic reprogramming, the underlying molecular pathways remain incompletely understood. This review highlights recent mechanistic discoveries and conceptual advances in understanding microbiome–host epigenome interactions. We discuss evolutionarily conserved pathways through which gut microbiota regulate host methylation patterns, including one-carbon metabolism, polyamine biosynthesis, short-chain fatty acid signaling, and extracellular vesicle-mediated communication. We also examine how host factors such as aging, diet, immune activity, and sociocultural context reciprocally influence microbial composition and function. Beyond basic mechanisms, we outline translational frontiers—including biomarker discovery, live biotherapeutic interventions, fecal microbiota transplantation, and adaptive clinical trial designs—that may enable microbiome-informed approaches to disease prevention and treatment. Advances in high-throughput methylation mapping, artificial intelligence, and single-cell multi-omics are accelerating our ability to model these complex interactions at high resolution. Finally, we emphasize the importance of rigorous standardization and ethical data governance through frameworks such as the FAIR and CARE principles. Deepening our understanding of how the gut microbiome modulates host epigenetic programs offers novel opportunities for precision health strategies and equitable clinical translation. Full article

15 pages, 1799 KB  
Article
The Biological Variation in Serum ACE and CPN/CPB2 Activity in Healthy Individuals as Measured by the Degradation of Dabsylated Bradykinin—Reference Data and the Importance of Pre-Analytical Standardization
by Malte Bayer, Michael Snyder and Simone König
Proteomes 2025, 13(3), 40; https://doi.org/10.3390/proteomes13030040 - 27 Aug 2025
Viewed by 791
Abstract
Background: Bradykinin (BK) is an inflammatory mediator. The degradation of labeled synthetic BK in biofluids can be used to report on the activity of angiotensin-converting enzyme (ACE) and the basic carboxypeptidases N and B2 (CPN/CPB2), for which the neuropeptide is a substrate. Clinical studies have shown significant changes in the serum activity of these enzymes in patients with inflammatory diseases. Methods: Here, we investigated variation in the cleavage of dabsylated synthetic BK (DBK) in serum and the formation of the major enzymatic fragments using a thin-layer chromatography-based neuropeptide reporter assay (NRA) in a large cohort of healthy volunteers from the international human Personal Omics Profiling consortium based at Stanford University. Results: Four major outcomes were reported. First, a set of NRA reference data for the healthy population was delivered, which is important for future investigations of patient sera. Second, it was shown that the measured serum degradation capacity for DBK was significantly higher in males than in females. There was no significant correlation of the NRA results with ethnicity, body mass index, or overnight fasting. Third, a batch effect was noted among sampling sites (HUPO conferences); we therefore used subcohorts rather than the entire collection for data mining. Fourth, as the low-cost and robust NRA is sensitive to enzyme activity, it provides a quick test to eliminate degraded and/or otherwise questionable samples. Conclusions: The results reiterate the critical importance of a high level of standardization in pre-analytical sample collection and processing—most notably, sample quality should be evaluated before conducting any large and expensive omics analyses. Full article
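The reference data delivered as the study's first outcome amount to population reference intervals for the assay readout. A common parametric form (mean ± 1.96 SD, assuming approximately normal values) can be sketched as below; this is a generic illustration, not the statistical method the authors necessarily used.

```python
import statistics

def reference_interval(values, z=1.96):
    """Parametric 95% reference interval (mean +/- z * SD) for an assay
    readout measured in a healthy cohort.  Assumes roughly normal data."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)  # sample standard deviation
    return mean - z * sd, mean + z * sd
```

A patient sample falling outside the interval computed from healthy-cohort values would then be flagged for follow-up; sex-stratified intervals would be warranted here given the reported male–female difference in DBK degradation capacity.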
(This article belongs to the Section Proteomics Technology and Methodology Development)
