Search Results (521)

Search Parameters:
Keywords = workflow reproducibility

21 pages, 830 KB  
Article
Predicting Breast Cancer Mortality Using SEER Data: A Comparative Analysis of L1-Logistic Regression and Neural Networks
by Mayra Cruz-Fernandez, Francisco Antonio Castillo-Velásquez, Carlos Fuentes-Silva, Omar Rodríguez-Abreo, Rafael Rojas-Galván, Marcos Avilés and Juvenal Rodríguez-Reséndiz
Technologies 2026, 14(1), 66; https://doi.org/10.3390/technologies14010066 - 15 Jan 2026
Abstract
Breast cancer remains a leading cause of mortality among women worldwide, motivating the development of transparent and reproducible risk models for clinical decision making. Using the open-access SEER Breast Cancer dataset (November 2017 release), we analyzed 4005 women diagnosed between 2006 and 2010 with infiltrating duct and lobular carcinoma (ICD-O-3 8522/3). Thirty-one clinical and demographic variables were preprocessed with one-hot encoding and z-score standardization, and the lymph node ratio was derived to characterize metastatic burden. Two supervised models, L1-regularized logistic regression and a feedforward artificial neural network, were compared under identical preprocessing, fixed 60/20/20 data splits, and stratified five-fold cross-validation. To define clinically meaningful endpoints and handle censoring, we reformulated mortality prediction as fixed-horizon classification at 3 and 5 years, and evaluated discrimination, calibration, and operating thresholds. Logistic regression demonstrated consistently strong performance, achieving test ROC-AUC values of 0.78 at 3 years and 0.75 at 5 years, with substantially superior calibration (Brier score less than or equal to 0.12, ECE less than or equal to 0.03). A structured hyperparameter search with repeated-seed evaluation identified optimal neural network architectures for each horizon, yielding test ROC-AUC values of 0.74 at 3 years and 0.73 at 5 years, but with markedly poorer calibration (ECE 0.19 to 0.23). Bootstrap analysis showed no significant AUC difference between models at 3 years, but logistic regression exhibited greater stability across folds and lower sensitivity to feature pruning. 
Overall, L1-regularized logistic regression provides competitive discrimination (ROC-AUC 0.75 to 0.78), markedly superior probability calibration (ECE below 0.03 versus 0.19 to 0.23 for the neural network), and approximately 40% lower cross-validation variance, supporting its use for scalable screening, risk stratification, and triage workflows on structured registry data. Full article
(This article belongs to the Section Assistive Technologies)
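The calibration metrics this abstract leans on (Brier score and expected calibration error, ECE) can be sketched directly. A minimal sketch in plain Python; the toy predictions and function names are illustrative, not the authors' code:

```python
def brier_score(y_true, p_pred):
    """Mean squared difference between predicted probability and outcome."""
    return sum((p - y) ** 2 for y, p in zip(y_true, p_pred)) / len(y_true)

def expected_calibration_error(y_true, p_pred, n_bins=10):
    """Bin predictions by confidence; ECE is the weighted mean gap between
    each bin's average predicted probability and its observed event rate."""
    bins = [[] for _ in range(n_bins)]
    for y, p in zip(y_true, p_pred):
        bins[min(int(p * n_bins), n_bins - 1)].append((y, p))
    n = len(y_true)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_p = sum(p for _, p in b) / len(b)
        frac_pos = sum(y for y, _ in b) / len(b)
        ece += (len(b) / n) * abs(avg_p - frac_pos)
    return ece

# Hypothetical fixed-horizon (e.g., 3-year) mortality predictions for six patients
y = [0, 0, 1, 0, 1, 1]
p = [0.1, 0.2, 0.8, 0.3, 0.7, 0.9]
print(round(brier_score(y, p), 3))                       # 0.047
print(round(expected_calibration_error(y, p, n_bins=5), 3))  # 0.2
```

A well-calibrated model drives both numbers toward zero, which is why the paper can report a lower Brier score and ECE for logistic regression even where ROC-AUC differences are not significant.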

19 pages, 813 KB  
Review
Maca (Lepidium meyenii) as a Functional Food and Dietary Supplement: A Review on Analytical Studies
by Andreas Wasilewicz and Ulrike Grienke
Foods 2026, 15(2), 306; https://doi.org/10.3390/foods15020306 - 14 Jan 2026
Abstract
Maca (Lepidium meyenii Walp.), a Brassicaceae species native to the high Andes of Peru, has gained global attention as a functional food and herbal medicinal product due to its endocrine-modulating, fertility-enhancing, and neuroprotective properties. Although numerous studies have addressed its biological effects, a systematic and up-to-date summary of its chemical constituents and analytical methodologies is lacking. This review aims to provide a critical overview of the chemical constituents of L. meyenii and to evaluate analytical studies published between 2000 and 2025, focusing on recent advances in extraction strategies and qualitative and quantitative analytical techniques for quality control. Major compound classes include macamides, macaenes, glucosinolates, and alkaloids, each contributing to maca’s multifaceted activity. Ultra-(high-)performance liquid chromatography (U(H)PLC), often coupled with ultraviolet, diode array, or mass spectrometric detection, is the primary and most robust analytical platform due to its sensitivity, selectivity, and throughput, while ultrasound-assisted extraction improves efficiency and reproducibility. Emerging techniques such as metabolomics and chemometric approaches enhance quality control by enabling holistic, multivariate assessment of complex systems and early detection of variations not captured by traditional univariate methods. As such, they provide complementary, predictive, and more representative insights into maca’s phytochemical complexity. The novelty of this review lies in its integration of conventional targeted analysis with emerging approaches, comprehensive comparison of analytical workflows, and critical discussion of variability related to phenotype, geographic origin, and post-harvest processing. By emphasizing analytical standardization and quality assessment rather than biological activity alone, this review provides a framework for quality control, authentication, and safety evaluation of L. meyenii as a functional food and dietary supplement. Full article

24 pages, 6146 KB  
Article
Feasibility of Conditional Tabular Generative Adversarial Networks for Ecologically Plausible Synthetic River Water-Quality Data: A Statistical and Ecological Similarity Assessment
by Orhan Ibram, Luminita Moraru, Simona Moldovanu, Catalina Maria Topa, Catalina Iticescu and Puiu-Lucian Georgescu
Water 2026, 18(2), 214; https://doi.org/10.3390/w18020214 - 14 Jan 2026
Abstract
Reliable biological datasets, especially those integrating biotic indices such as the Saprobic Index, are scarce, limiting machine and deep learning applications in aquatic ecosystem assessments. This study evaluates Conditional Tabular Generative Adversarial Networks (CTGANs) for generating synthetic datasets that combine physico-chemical parameters with a biological index (Saprobic Index) from multiple monitoring stations in the lower Danube River. Beyond univariate distributional agreement, we assess whether ecologically meaningful multivariate relationships are preserved in the synthetic tables. To support this, we propose an ecology-oriented validation workflow that combines distributional tests with correlation structure and clustering diagnostics across stations. Real monitoring datasets were statistically modelled and recreated using CTGANs, then qualitatively assessed for realism. Comparisons between synthetic and real data employed box plots, Wilcoxon rank-sum tests, correlation matrices, and K-means clustering across stations. Stable variables, including pH, total dissolved solids, and chemical oxygen demand, were well replicated, showing no significant distributional differences (p > 0.05). Conversely, dynamic parameters such as dissolved oxygen, total nitrogen, and suspended solids exhibited notable discrepancies (p < 0.05). Correlation analyses indicated that several strong associations present in the observed data (e.g., total nitrogen–nitrate and total nitrogen–electrical conductivity) were substantially weaker in the synthetic dataset. Overall, a CTGAN can reproduce several marginal patterns but may fail to preserve key ecological linkages, which constrains its use in ecological relationship-dependent inference. While promising for exploratory modelling and general trend analysis, synthetic data should be applied cautiously for studies involving seasonally influenced, biologically significant parameters. Full article
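The correlation-structure diagnostic described in this abstract can be sketched simply: compute pairwise Pearson correlations in the real and synthetic tables and flag pairs whose association weakens sharply. The data, threshold, and helper names below are illustrative assumptions, not the study's values:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def weakened_pairs(real, synth, threshold=0.3):
    """Return variable pairs whose |correlation| drops by more than `threshold`
    in the synthetic table relative to the real one."""
    cols = list(real)
    flagged = []
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            drop = abs(pearson(real[a], real[b])) - abs(pearson(synth[a], synth[b]))
            if drop > threshold:
                flagged.append((a, b))
    return flagged

# Toy example: total nitrogen tracks nitrate in the "real" data
# but the linkage is scrambled in the "synthetic" table
real = {"TN": [1.0, 2.0, 3.0, 4.0], "NO3": [1.1, 2.0, 2.9, 4.2], "pH": [7.1, 7.0, 7.2, 7.1]}
synth = {"TN": [1.0, 2.0, 3.0, 4.0], "NO3": [2.9, 1.1, 4.2, 2.0], "pH": [7.0, 7.2, 7.1, 7.1]}
print(weakened_pairs(real, synth))  # [('TN', 'NO3')]
```

This is the kind of check that catches a generator which reproduces marginals (each column's distribution) while losing multivariate ecological linkages such as the TN–nitrate association noted above.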

6 pages, 216 KB  
Comment
Comment on Iacobescu et al. Evaluating Binary Classifiers for Cardiovascular Disease Prediction: Enhancing Early Diagnostic Capabilities. J. Cardiovasc. Dev. Dis. 2024, 11, 396
by Mohamed Eltawil, Laura Byham-Gray, Yuane Jia, Neil Mistry, James Parrott and Suril Gohel
J. Cardiovasc. Dev. Dis. 2026, 13(1), 46; https://doi.org/10.3390/jcdd13010046 - 13 Jan 2026
Abstract
Machine learning is increasingly applied to cardiovascular disease prediction, yet reported performance metrics often appear implausibly high due to methodological errors. Recent work has reported nearly perfect predictive accuracy (≈99%) using a k-Nearest Neighbors (kNN) model on CDC heart-disease data. Such performance greatly exceeds typical BRFSS-based benchmarks and strongly indicates data leakage. In this commentary, we replicate and re-analyze the original workflow, showing that the authors applied the SMOTE-ENN resampling method prior to the train/test split, thereby allowing synthetic data generated from the full dataset to contaminate the test set. Combined with an excessively small neighborhood parameter (k = 2), this produced misleadingly high accuracy. It is noted that (1) with SMOTE-ENN performed globally, synthetic samples appear nearly identical to test points, leading to near-perfect classification, and (2) this choice of k is unusually small for a dataset of this scale and further amplifies leakage bias. Correcting the workflow by restricting oversampling to the training data or using undersampling restores realistic results, reducing predictive accuracy to approximately 80%, confirming the inflation caused by pre-split resampling and aligning with literature norms. This case underscores the critical importance of rigorous validation, transparent reporting, and leakage-free pipelines in medical AI. We outline practical guidelines for avoiding such pitfalls and ensuring reproducible, realistic, and clinically reliable machine-learning studies. Full article
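The leakage mechanism this comment describes can be demonstrated without any ML library: resampling before the split copies (near-)duplicates of future test rows into the training set. A toy sketch in plain Python; simple duplication stands in for SMOTE-ENN, and all names and sizes are illustrative:

```python
import random

def duplicate_minority(rows, labels):
    """Naive oversampling: repeat every minority-class (label 1) row once."""
    extra = [(r, y) for r, y in zip(rows, labels) if y == 1]
    return rows + [r for r, _ in extra], labels + [y for _, y in extra]

def split(rows, labels, test_frac=0.25, seed=0):
    """Shuffled train/test split with a fixed seed."""
    idx = list(range(len(rows)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * (1 - test_frac))
    tr, te = idx[:cut], idx[cut:]
    return ([rows[i] for i in tr], [labels[i] for i in tr],
            [rows[i] for i in te], [labels[i] for i in te])

random.seed(1)
rows = [tuple(random.random() for _ in range(3)) for _ in range(200)]
labels = [1 if i < 40 else 0 for i in range(200)]  # 20% minority class

# WRONG order: oversample the full dataset, then split.
r2, l2 = duplicate_minority(rows, labels)
tr_x, _, te_x, _ = split(r2, l2)
leaked = sum(1 for r in te_x if r in tr_x)  # test rows with a copy in train

# RIGHT order: split first, then oversample only the training fold.
tr_x, tr_y, te_x, te_y = split(rows, labels)
tr_x, tr_y = duplicate_minority(tr_x, tr_y)
clean = sum(1 for r in te_x if r in tr_x)

print(leaked > 0, clean == 0)
```

With the wrong ordering, a nearest-neighbor classifier with tiny k essentially memorizes those leaked copies, which is exactly how near-perfect test accuracy can arise.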
18 pages, 1758 KB  
Review
Computational Workflow for Chemical Compound Analysis: From Structure Generation to Molecular Docking
by Jesus Magdiel García-Díaz, Asbiel Felipe Garibaldi-Ríos, Martha Patricia Gallegos-Arreola, Filiberto Gutiérrez-Gutiérrez, Jorge Iván Delgado-Saucedo, Moisés Martínez-Velázquez and Ana María Puebla-Pérez
Sci. Pharm. 2026, 94(1), 9; https://doi.org/10.3390/scipharm94010009 - 13 Jan 2026
Abstract
Drug discovery is a complex and expensive process in which only a small proportion of candidate molecules reach clinical approval. Computational methods, particularly computer-aided drug design (CADD), have become fundamental to accelerate and optimize early stages of discovery by integrating chemical, biological, and pharmacokinetic information into predictive models. This review outlines a complete computational workflow for chemical compound analysis, covering molecular structure generation, database selection, evaluation of absorption, distribution, metabolism, excretion and toxicity (ADMET), target prediction, and molecular docking. It focuses on freely accessible and web-based tools that enable reproducible, cost-effective, and scalable in silico studies. Key platforms such as PubChem, ChEMBL, RDKit, SwissADME, TargetNet, and SwissDock are highlighted as examples of how different resources can be integrated to support rational compound design and prioritization. The article also discusses essential methodological principles, data curation strategies, and common limitations in virtual screening and docking analyses. Finally, it explores future directions in computational drug discovery, including the incorporation of artificial intelligence, multi-omics integration, and quantum simulations, to enhance predictive accuracy and translational relevance. Full article
(This article belongs to the Topic Bioinformatics in Drug Design and Discovery—2nd Edition)

21 pages, 5797 KB  
Article
Dental Preparation Guides—From CAD to PRINT and CAM
by Florina Titihazan, Tareq Hajaj, Andreea Codruța Novac, Daniela Maria Pop, Cosmin Sinescu, Meda Lavinia Negruțiu, Mihai Romînu and Cristian Zaharia
Oral 2026, 6(1), 12; https://doi.org/10.3390/oral6010012 - 12 Jan 2026
Abstract
Objectives: The aim of this study was to present and describe a digital workflow integrating Digital Smile Design (DSD) with computer-aided design/computer-aided manufacturing (CAD/CAM) and additive manufacturing technologies for the fabrication of dental preparation guides, focusing on workflow feasibility, design reproducibility, and clinical handling. Materials and Methods: A digital workflow was implemented using intraoral scanning and Exocad DentalCAD 3.1 Elefsina software to design dental preparation guides based on digitally planned restorations. Preparation margins, insertion paths, and minimal material thickness were defined virtually. The guides were fabricated using both subtractive (PMMA milling) and additive (stereolithographic-based 3D printing) manufacturing techniques. Post-processing included chemical cleaning, support removal, additional light curing, and manual finishing. The evaluation was qualitative and descriptive, based on visual inspection, workflow performance, and guide adaptation to printed models. Results: The proposed digital workflow was associated with consistent fabrication of preparation guides and predictable transfer of the virtual design to the manufactured guides. Digital planning facilitated clear visualization of preparation margins and insertion axes, supporting controlled and minimally invasive tooth preparation. The workflow demonstrated good reproducibility and efficient communication between clinician and dental technician. No quantitative measurements or statistical analyses were performed. Conclusions: Within the limitations of this qualitative feasibility study, the integration of DSD with CAD/CAM and 3D printing technologies represents a viable digital approach for designing and fabricating dental preparation guides. The workflow shows potential for improving predictability and communication in restorative dentistry. Full article

21 pages, 75033 KB  
Article
From Stones to Screen: Open-Source 3D Modeling and AI Video Generation for Reconstructing the Coëby Necropolis
by Jean-Baptiste Barreau and Philippe Gouézin
Heritage 2026, 9(1), 24; https://doi.org/10.3390/heritage9010024 - 10 Jan 2026
Abstract
This study presents a comprehensive digital workflow for the archaeological investigation and heritage enhancement of the Coëby megalithic necropolis (Brittany, France). Dating to the Middle Neolithic, between the 4th and 3rd millennia BC, this chronology is established through stratigraphy, material culture, and radiocarbon dating. Focusing on cairns TRED 8 and TRED 9, which are two excavation units, we combined field archaeology, photogrammetry, and topographic data with open-source 3D geometric modeling to reconstruct the monuments’ original volumes and test construction hypotheses. The methodology leveraged the free software Blender (version 3.0.1) and its Bagapie extension for the procedural simulation of lithic block distribution within the tumular masses, ensuring both metric accuracy and realistic texturing. Beyond static reconstruction, the research explores innovative dynamic and narrative visualization techniques. We employed the FILM model for smooth video interpolation of the construction sequences and utilized the Wan 2.1 AI model to generate immersive video scenes of Neolithic life based on archaeologically informed prompts. The entire process, from data acquisition to final visualization, was conducted using free and open-source tools, guaranteeing full methodological reproducibility and alignment with open science principles. Our results include detailed 3D reconstructions that elucidate the complex architectural sequences of the cairns, as well as dynamic visualizations that enhance the understanding of their construction logic. This study demonstrates the analytical potential of open-source 3D modelling and AI-based visualisation for megalithic archaeology. Full article
(This article belongs to the Topic 3D Documentation of Natural and Cultural Heritage)

17 pages, 1585 KB  
Review
Second-Opinion Systems for Rare Diseases: A Scoping Review of Digital Workflows and Networks
by Vinícius Lima, Mariana Mozini and Domingos Alves
Informatics 2026, 13(1), 6; https://doi.org/10.3390/informatics13010006 - 10 Jan 2026
Abstract
Introduction: Rare diseases disperse expertise across institutions and borders, making structured second-opinion systems a pragmatic way to concentrate subspecialty knowledge and reduce diagnostic delays. This scoping review mapped the design, governance, adoption, and impacts of such services across implementation scales. Objectives: To describe how second-opinion services for rare diseases are organized and governed, to characterize technological and workflow models, to summarize benefits and barriers, and to identify priority evidence gaps for implementation. Methods: Using a population–concept–context approach, we included peer-reviewed studies describing implemented second-opinion systems for rare diseases and excluded isolated case reports, purely conceptual proposals, and work outside this focus. Searches in August 2025 covered PubMed/MEDLINE, Scopus, Web of Science Core Collection, Cochrane Library, IEEE Xplore, ACM Digital Library, and LILACS without date limits and were restricted to English, Portuguese, or Spanish. Two reviewers screened independently, and the data were charted with a standardized, piloted form. No formal critical appraisal was undertaken, and the synthesis was descriptive. Results: Initiatives were clustered by scale (European networks, national programs, regional systems, international collaborations) and favored hybrid models over asynchronous and synchronous ones. Across settings, services shared reproducible workflows and provided faster access to expertise, quicker decision-making, and more frequent clarification of care plans. These improvements were enabled by transparent governance and dedicated support but were constrained by platform complexity, the effort required to assemble panels, uneven incentives, interoperability gaps, and medico-legal uncertainty. Conclusions: Systematized second-opinion services for rare diseases are feasible and clinically relevant. 
Progress hinges on usability, aligned incentives, and pragmatic interoperability, advancing from registries toward bidirectional electronic health record connections, alongside prospective evaluations of outcomes, equity, experience, effectiveness, and costs. Full article
(This article belongs to the Section Health Informatics)

23 pages, 17893 KB  
Article
Multimodal Control of Manipulators: Coupling Kinematics and Vision for Self-Driving Laboratory Operations
by Shifa Sulaiman, Amarnath Harikumar, Simon Bøgh and Naresh Marturi
Robotics 2026, 15(1), 17; https://doi.org/10.3390/robotics15010017 - 9 Jan 2026
Abstract
Autonomous experimental platforms increasingly rely on robust, vision-guided robotic manipulation to support reliable and repeatable laboratory operations. This work presents a modular motion-execution subsystem designed for integration into self-driving laboratory (SDL) workflows, focusing on the coupling of real-time visual perception with smooth and stable manipulator control. The framework enables autonomous detection, tracking, and interaction with textured objects through a hybrid scheme that couples advanced motion planning algorithms with real-time visual feedback. Kinematic analysis of the manipulator is performed using screw theory formulations, which provide a rigorous foundation for deriving forward kinematics and the space Jacobian. These formulations are further employed to compute inverse kinematic solutions via the Damped Least Squares (DLS) method, ensuring stable and continuous joint trajectories even in the presence of redundancy and singularities. Motion trajectories toward target objects are generated using the RRT* algorithm, offering optimal path planning under dynamic constraints. Object pose estimation is achieved through a vision workflow integrating feature-driven detection and homography-guided depth analysis, enabling adaptive tracking and dynamic grasping of textured objects. The manipulator’s performance is quantitatively evaluated using smoothness metrics, RMSE pose errors, and joint motion profiles, including velocity continuity, acceleration, jerk, and snap. Simulation results demonstrate that the proposed subsystem delivers stable, smooth, and reproducible motion execution, establishing a validated baseline for the manipulation layer of next-generation SDL architectures. Full article
(This article belongs to the Special Issue Visual Servoing-Based Robotic Manipulation)
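The DLS update named in this abstract, Δθ = Jᵀ(JJᵀ + λ²I)⁻¹e, can be sketched for a simple 2-link planar arm. The link lengths, damping value, and iteration count below are assumptions for illustration, not the paper's manipulator:

```python
import math

L1, L2 = 1.0, 1.0  # assumed link lengths

def fk(t1, t2):
    """Forward kinematics: end-effector position (x, y)."""
    return (L1 * math.cos(t1) + L2 * math.cos(t1 + t2),
            L1 * math.sin(t1) + L2 * math.sin(t1 + t2))

def jacobian(t1, t2):
    s1, c1 = math.sin(t1), math.cos(t1)
    s12, c12 = math.sin(t1 + t2), math.cos(t1 + t2)
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def dls_step(t1, t2, target, lam=0.1):
    """One Damped Least Squares step: theta += J^T (J J^T + lam^2 I)^{-1} e.
    The 2x2 system is solved in closed form; lam bounds the step near singularities."""
    x, y = fk(t1, t2)
    ex, ey = target[0] - x, target[1] - y
    J = jacobian(t1, t2)
    a = J[0][0]**2 + J[0][1]**2 + lam**2        # (J J^T + lam^2 I)[0][0]
    b = J[0][0]*J[1][0] + J[0][1]*J[1][1]       # off-diagonal term
    d = J[1][0]**2 + J[1][1]**2 + lam**2        # (J J^T + lam^2 I)[1][1]
    det = a * d - b * b
    vx = ( d * ex - b * ey) / det               # v = (J J^T + lam^2 I)^{-1} e
    vy = (-b * ex + a * ey) / det
    return (t1 + J[0][0]*vx + J[1][0]*vy,       # theta + J^T v
            t2 + J[0][1]*vx + J[1][1]*vy)

t1, t2 = 0.3, 0.5
target = (1.2, 0.8)
for _ in range(50):
    t1, t2 = dls_step(t1, t2, target)
x, y = fk(t1, t2)
print(round(x, 3), round(y, 3))  # 1.2 0.8
```

Unlike a plain Jacobian pseudoinverse, the λ² term keeps the matrix invertible and the joint increments bounded when the arm approaches a singular configuration, which is the stability property the abstract highlights.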

32 pages, 4378 KB  
Review
Precision, Reproducibility, and Validation in Zebrafish Genome Editing: A Critical Review of CRISPR, Base, and Prime Editing Technologies
by Meher un Nissa, Yidong Feng, Shahid Ali and Baolong Bao
Fishes 2026, 11(1), 41; https://doi.org/10.3390/fishes11010041 - 9 Jan 2026
Abstract
The rapid evolution of CRISPR/Cas technology has transformed genome editing across biological systems, among which zebrafish has emerged as a powerful vertebrate model for functional genomics and disease research. Due to its transparency, genetic similarity to humans, and suitability for large-scale screening, zebrafish is an appropriate system for translating molecular discoveries into biomedical and environmental applications. This review therefore highlights the recent progress in zebrafish gene editing, targeting innovations in ribonucleoprotein delivery, PAM-flexible Cas variants, and precision editors. These approaches have greatly improved editing accuracy, reduced mosaicism, and enabled efficient F0 phenotyping. In the near future, automated microinjections, optimized guide RNA design, and multi-omics validation pipelines are expected to enhance reproducibility and scalability. Although recent innovations such as ribonucleoprotein delivery, PAM-flexible Cas variants, and precision editors have expanded the zebrafish genome-editing toolkit, their benefits are often incremental and context-dependent. Mosaicism, allele complexity, and variable germline transmission remain common, particularly in F0 embryos. Precision editors enable defined nucleotide changes but typically exhibit modest efficiencies and locus-specific constraints in zebrafish. Consequently, rigorous validation, standardized workflows, and careful interpretation of F0 phenotypes remain essential. This review critically examines both the capabilities and limitations of current zebrafish gene-editing technologies, emphasizing experimental trade-offs, reproducibility challenges, and realistic use cases. Full article
(This article belongs to the Section Genetics and Biotechnology)

16 pages, 2278 KB  
Article
Headspace SPME GC–MS Analysis of Urinary Volatile Organic Compounds (VOCs) for Classification Under Sample-Limited Conditions
by Lea Woyciechowski, Tushar H. More, Sabine Kaltenhäuser, Sebastian Meller, Karolina Zacharias, Friederike Twele, Alexandra Dopfer-Jablonka, Tobias Welte, Thomas Illig, Georg M. N. Behrens, Holger A. Volk and Karsten Hiller
Metabolites 2026, 16(1), 57; https://doi.org/10.3390/metabo16010057 - 8 Jan 2026
Abstract
Background/Objectives: Volatile organic compounds (VOCs) are emerging as non-invasive biomarkers of metabolic and disease-related processes, yet their reliable detection from complex biological matrices such as urine remains analytically challenging. This study aimed to establish a robust, non-targeted headspace solid-phase microextraction gas chromatography–mass spectrometry (HS–SPME GC–MS) workflow optimized for very small-volume urinary samples. Methods: We systematically evaluated the effects of pH adjustment and NaCl addition on VOC extraction efficiency using a 75 µm CAR/PDMS fiber and a sample volume of only 0.75 mL. Method performance was further assessed using concentration-dependent experiments with representative VOC standards and by application to real human urine samples analyzed in technical triplicates. Results: Acidification to pH 3 markedly improved extraction performance, increasing both total signal intensity and the number of detectable VOCs, whereas alkaline conditions and additional NaCl produced only minor effects. Representative VOC standards showed compound-specific linear dynamic ranges with minimal carry-over within the relevant analytical range. Application to real urine samples confirmed high analytical reproducibility, with triplicates clustering tightly in principal component analysis and most metabolites exhibiting relative standard deviations below 25%. Conclusions: The optimized HS–SPME GC–MS method enables comprehensive, non-targeted urinary VOC profiling from limited sample volumes. This workflow provides a robust analytical foundation for exploratory volatilomics studies under sample-limited conditions and supports subsequent targeted method refinement once specific compounds or chemical classes have been prioritized. Full article
(This article belongs to the Section Endocrinology and Clinical Metabolic Research)
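The reproducibility criterion quoted in this abstract (relative standard deviation of technical triplicates below 25%) is straightforward to compute. A minimal sketch with invented peak areas; the values are not from the study:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation: sample SD as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

triplicate = [10400.0, 9800.0, 10100.0]  # hypothetical GC-MS peak areas for one VOC
rsd = rsd_percent(triplicate)
print(round(rsd, 1), rsd < 25)  # 3.0 True
```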

29 pages, 37031 KB  
Article
Digital Replicas and 3D Virtual Reconstructions for Large Excavations in Urban Archaeology: Methods, Tools, and Techniques Drawn from the “Metro C” Case Study in Rome
by Emanuel Demetrescu, Daniele Ferdani, Bruno Fanini, Enzo D’Annibale, Simone Berto, Simona Morretta and Rossella Rea
Remote Sens. 2026, 18(2), 203; https://doi.org/10.3390/rs18020203 - 8 Jan 2026
Abstract
This contribution presents an integrated methodological pipeline for digital documentation and virtual reconstruction of large-scale urban archaeological excavations, developed through the Amba Aradam case study (Metro C line, Rome). The excavation revealed a 2nd-century A.D. military complex extending over 4770 m² at depths reaching 20 m, documented through multiple photogrammetric campaigns (2016–2018) as structures were progressively excavated and removed. We established an empirically validated texture density standard (1.26 mm²/texel) for photorealistic digital replicas suitable for immersive HMD and desktop exploration, with an explicit texture density calculation formula ensuring reproducibility. The temporal integration workflow merged 3D snapshots acquired across three excavation campaigns while maintaining geometric and chromatic consistency. Semantic documentation, through the extended matrix framework, recorded Virtual Stratigraphic Units linking archaeological evidence, comparative sources, and interpretative reasoning (paradata) for transparent virtual reconstruction. The complete pipeline, implemented through open-source 3DSC 1.4 and EMtools add-ons for Blender and Metashape v0.9 (available on GitHub), addresses specific challenges of documenting complex stratigraphic contexts within active construction environments where in situ preservation is not feasible. The spatial integration of the digital replica with previous archaeological data illuminated the urban evolution of Rome’s military topography during the 2nd–3rd centuries A.D., demonstrating the essential role of advanced digital documentation in contemporary urban archaeology. Full article

26 pages, 3672 KB  
Article
A Computational Sustainability Framework for Vegetation Degradation and Desertification Assessment in Arid Lands in Saudi Arabia
by Afaf AlAmri, Majdah Alshehri and Ohoud Alharbi
Sustainability 2026, 18(2), 641; https://doi.org/10.3390/su18020641 - 8 Jan 2026
Viewed by 116
Abstract
Vegetation degradation in arid and semi-arid regions is intensifying due to rising temperatures, declining rainfall, soil exposure, and persistent human pressures. Drylands cover over 41% of the global land surface and support nearly two billion people, making their degradation a major environmental and socio-economic concern. However, many remote sensing and GIS-based assessment approaches remain fragmented and difficult to reproduce. This study proposes a Computational Sustainability Framework for vegetation degradation assessment that integrates multi-source satellite data, biophysical indicators, automated geospatial preprocessing, and the Analytical Hierarchy Process (AHP) within a transparent and reproducible workflow. The framework comprises four phases: data preprocessing, indicator extraction and normalization, AHP-based modeling, and spatial classification with qualitative validation. The framework was applied to the Al-Khunfah and Harrat al-Harrah Protected Areas in northern Saudi Arabia using multi-source datasets for the January–April 2023 period, including Sentinel-2, Landsat-8, CHIRPS precipitation, ESA-CCI land cover, FAO soil data, and SRTM DEM. High degradation zones were associated with low NDVI (<0.079), high BSI (>0.276), and elevated LST (>49 °C), whereas low degradation areas were concentrated near wadis and relatively more fertile soils. Overall, the proposed framework provides a scalable and interpretable tool for early-stage vegetation degradation screening in arid environments, supporting the prioritization of areas for ecological investigation and restoration planning. Full article
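The abstract describes an AHP-based modeling phase that weights normalized biophysical indicators before spatial classification. As a rough illustration only, the sketch below derives AHP priority weights with the row geometric-mean approximation and combines indicators linearly; the pairwise comparison values and indicator choice (NDVI, BSI, LST) are illustrative assumptions, not the study's actual weights.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric-mean method."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Illustrative 3-indicator comparison matrix (NDVI vs BSI vs LST);
# the Saaty-scale values here are assumed for demonstration.
pw = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
weights = ahp_weights(pw)  # priorities sum to 1

def degradation_score(indicators, weights):
    """Weighted linear combination of normalized (0-1) indicators."""
    return sum(wi * xi for wi, xi in zip(weights, indicators))
```

The resulting score surface could then be binned into degradation classes (e.g. low/moderate/high) for mapping, mirroring the framework's spatial classification phase.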
(This article belongs to the Section Sustainable Agriculture)

25 pages, 2500 KB  
Article
Serum Protein Signatures for Breast Cancer Detection in Treatment-Naïve African American Women Using Integrated Proteomics and Pattern Analysis
by Padma P. Tadi Uppala, Elmer C. Rivera, Hyun J. Kwon and Sharon S. Lum
Sensors 2026, 26(2), 403; https://doi.org/10.3390/s26020403 - 8 Jan 2026
Viewed by 239
Abstract
Breast cancer is the leading cause of cancer-related mortality in African American (AA) women. In this study we evaluated the serum proteomic profile of AA women with breast cancer using an integrated proteomic framework with multivariate pattern analysis. Using 2D-DIGE, thousands of serum protein spots were detected across 33 gels; 46 spots met criteria for presence, statistical significance, and differential expression. Proteins from the spots were identified by MALDI-TOF/TOF and matched in curated databases, highlighting serum biomarkers including ceruloplasmin, alpha-2-macroglobulin, complement components C3 and C6, alpha-1-antitrypsin, alpha-1B-glycoprotein, alpha-2-HS-glycoprotein, and haptoglobin-related protein. LC–MS/MS analysis revealed 163 differentiating peptides after imputing and filtering 286 peptides. These were evaluated using cumulative distribution function (CDF) analysis, a nonparametric method suited for limited sample sizes. Peptide patterns were explored with Random Forest, showing concordance with CDF. The model achieved an AUC of 0.85 at the peptide level. This workflow identified differentiating proteins (CERU, A2MG, CO3, VTDB, HEMO, APOB, APOA4, CFAH, CO4A, AACT, K1C10, ITIH2, ITIH4), highlighting CERU, A2MG, and CO3 with overexpression and reproducible identification across platforms. We present an integrated, non-invasive serum protein biomarker signature panel specific to AA women, developed through a reproducible proteomic sensor framework to support early detection and breast cancer prevention. Full article
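The abstract cites cumulative distribution function (CDF) analysis as a nonparametric way to compare peptide abundances between groups with limited sample sizes. One common realization of that idea is a Kolmogorov–Smirnov-style separation between two empirical CDFs, sketched below; this is an assumed interpretation, not necessarily the exact statistic the authors used.

```python
# KS-style separation of two empirical CDFs -- a hedged sketch of one
# way "CDF analysis" can discriminate groups; details are assumptions.

def ecdf(values):
    """Return the empirical CDF of a sample as a callable."""
    xs = sorted(values)
    n = len(xs)
    return lambda x: sum(v <= x for v in xs) / n

def max_cdf_separation(group_a, group_b):
    """Maximum vertical distance between the two empirical CDFs."""
    fa, fb = ecdf(group_a), ecdf(group_b)
    points = sorted(set(group_a) | set(group_b))
    return max(abs(fa(x) - fb(x)) for x in points)
```

A peptide whose case and control abundance distributions yield a large separation would rank as differentiating; this requires no distributional assumptions, which suits small cohorts.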
(This article belongs to the Section Biomedical Sensors)

26 pages, 4558 KB  
Review
Integrating Additive Manufacturing into Dental Production: Innovations, Applications and Challenges
by Maryna Yeromina, Jan Duplak, Jozef Torok, Darina Duplakova and Monika Torokova
Inventions 2026, 11(1), 7; https://doi.org/10.3390/inventions11010007 - 7 Jan 2026
Viewed by 234
Abstract
Additive manufacturing (AM) has emerged as a key enabling technology in contemporary dental manufacturing, driven by its capacity for customization, geometric complexity, and seamless integration with digital design workflows. This article presents a technology-oriented narrative review of additive manufacturing in dental implant production, focusing on dominant processing routes, material systems, and emerging research trends rather than a systematic or critical appraisal of the literature. An indicative descriptive analysis of publications indexed in the Web of Science and Scopus databases between 2014 and 2024 was used to contextualize the technological development of the field and identify major research directions. Emphasis was placed on metal powder bed fusion technologies, specifically Selective Laser Melting (SLM) and Direct Metal Laser Sintering (DMLS), which enable the fabrication of titanium implants with controlled porosity and enhanced osseointegration. Ceramic AM approaches, including SLA, DLP, and PBF, are discussed in relation to their potential for aesthetic dental restorations and customized prosthetic components. The publication trend overview indicates a growing interest in ceramic AM after 2020, an increasing focus on hybrid and functionally graded materials, and persistent challenges related to standardization and the availability of long-term clinical evidence. Key technological limitations—including manufacturing accuracy, material stability, validated metrology, and process reproducibility—are highlighted alongside emerging directions such as artificial intelligence-assisted workflows, nanostructured surface modifications, and concepts enabling accelerated or immediate clinical use of additively manufactured dental restorations. Full article
(This article belongs to the Section Inventions and Innovation in Advanced Manufacturing)
