Search Results (282)

Search Parameters:
Keywords = nonstandard analysis

12 pages, 2085 KB  
Article
Temperature-Dependent Plastic Behavior of ASA: Johnson–Cook Plasticity Model Calibration and FEM Validation
by Peter Palička, Róbert Huňady and Martin Hagara
Materials 2026, 19(3), 470; https://doi.org/10.3390/ma19030470 (registering DOI) - 24 Jan 2026
Abstract
Acrylonitrile Styrene Acrylate (ASA) is widely used in outdoor structural applications due to its favorable mechanical stability and weather resistance; however, its temperature-dependent plastic behavior remains insufficiently characterized for accurate numerical simulation. This study presents a non-standard method of calibrating the temperature-dependent Johnson–Cook (J-C) plasticity model for ASA in the practical operating temperature range below the glass transition temperature. Uniaxial tensile tests at a constant strain rate of 0.01 s⁻¹ were performed at −10 °C, +23 °C, and +65 °C to characterize the effect of temperature on the material’s plastic response. The J-C parameters A, B, and n were identified for each temperature separately and globally using least-squares optimization implemented in MATLAB R2024b, showing good agreement with the experimental stress–strain curves. The calibrated parameters were subsequently implemented in Abaqus 2024 and validated through finite element simulations of the tensile tests. Numerical predictions demonstrated a very high correlation with the experimental data across all temperatures, confirming that the J-C model accurately captures the hardening behavior of ASA. The presented parameter set and calibration methodology provide a reliable basis for future simulation-driven design, forming analysis, and structural assessment of ASA components subjected to variable thermal conditions. Full article
(This article belongs to the Special Issue Recent Researches in Polymer and Plastic Processing (Second Edition))
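For orientation, a minimal Python sketch of the isothermal calibration step this abstract describes (fitting the Johnson–Cook hardening parameters A, B and n to a stress–strain curve by least squares); the strain and stress data, initial guesses and fitted values below are placeholders, not the authors' MATLAB workflow or measurements.

import numpy as np
from scipy.optimize import curve_fit

def jc_hardening(eps_p, A, B, n):
    # Johnson-Cook strain-hardening term at a fixed temperature and strain rate.
    return A + B * eps_p**n

# Placeholder true-stress / plastic-strain data for one test temperature.
eps_p = np.linspace(1e-3, 0.05, 50)
sigma = jc_hardening(eps_p, 35.0, 60.0, 0.45) + np.random.normal(0.0, 0.3, eps_p.size)

(A, B, n), _ = curve_fit(jc_hardening, eps_p, sigma, p0=[30.0, 50.0, 0.5])
print(f"A = {A:.2f} MPa, B = {B:.2f} MPa, n = {n:.3f}")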

16 pages, 7013 KB  
Article
Performance of Bacterial Concrete with Agro-Waste Capsules
by Ivanka Netinger Grubeša, Dalibor Kramarić, Dunja Šamec and Anđelko Crnoja
Appl. Sci. 2026, 16(2), 755; https://doi.org/10.3390/app16020755 - 11 Jan 2026
Viewed by 230
Abstract
This study investigates the effects of agro-waste-based capsules made from grape seeds and cherry pits on the physical, mechanical, thermal and self-healing properties of concrete. Capsule-containing mixtures were compared with a reference concrete after 28 days of water curing using both standardized and non-standardized testing methods. Capsule incorporation reduced workability by up to 91% and altered air content depending on capsule type, increasing it by 47% for grape seed capsules and decreasing it by 65% for cherry pit capsules. Fresh concrete density was reduced by 5.5% and 6.8% for grape seed and cherry pit capsules, respectively, while hardened concrete density decreased by 11% and 9%, implying lighter structures with improved seismic resistance. Compressive strength decreased by 49% for grape seed capsules and 27% for cherry pit capsules. Thermal conductivity was reduced by 32% and 22%, respectively, indicating improved energy efficiency. Concrete with grape seed capsules showed freeze–thaw performance comparable to the reference concrete after 112 cycles, whereas concrete with cherry pit capsules exhibited superior dynamic modulus behavior, suggesting continuous crack healing, despite significant mass loss due to poor capsule–matrix bonding. SEM analysis showed no significant crack reduction, while EDS revealed calcium-rich areas in grape seed capsule concrete, indicating possible crack healing. Overall, agro-waste capsule concrete shows potential for improving seismic resistance and energy efficiency, although further research is required to clarify the self-healing effect. Full article
(This article belongs to the Special Issue Innovative Building Materials: Design, Properties and Applications)

19 pages, 690 KB  
Review
Methodologies for Assessing the Dimensional Accuracy of Computer-Guided Static Implant Surgery in Clinical Settings: A Scoping Review
by Sorana Nicoleta Rosu, Monica Silvia Tatarciuc, Anca Mihaela Vitalariu, Roxana-Ionela Vasluianu, Irina Gradinaru, Nicoleta Ioanid, Catalina Cioloca Holban, Livia Bobu, Adina Oana Armencia, Alice Murariu, Elena-Odette Luca and Ana Maria Dima
Dent. J. 2026, 14(1), 43; https://doi.org/10.3390/dj14010043 - 8 Jan 2026
Viewed by 280
Abstract
Background: Computer-guided static implant surgery (CGSIS) is widely adopted to enhance the precision of dental implant placement. However, significant heterogeneity in reported accuracy values complicates evidence-based clinical decision-making. This variance is likely attributable to a fundamental lack of standardization in the methodologies used to assess dimensional accuracy. Objective: This scoping review aimed to systematically map, synthesize, and analyze the clinical methodologies used to quantify the dimensional accuracy of CGSIS. Methods: The review was conducted in accordance with the PRISMA-ScR guidelines. A systematic search of PubMed/MEDLINE, Scopus, and Embase was performed from inception to October 2025. Clinical studies quantitatively comparing planned versus achieved implant positions in human patients were included. Data were charted on study design, guide support type, data acquisition methods, reference systems for superimposition, measurement software, and accuracy metrics. Results: The analysis of 21 included studies revealed extensive methodological heterogeneity. Key findings included the predominant use of two distinct reference systems: post-operative CBCT (n = 12) and intraoral scanning with scan bodies (n = 6). A variety of proprietary and third-party software packages (e.g., coDiagnostiX, Geomagic, Mimics) were employed for superimposition, utilizing different alignment algorithms. Critically, this heterogeneity in measurement approach directly manifests in widely varying reported values for core accuracy metrics. In addition, the definitions and reporting of core accuracy metrics—specifically global coronal deviation (range of reported means: 0.55–1.70 mm), global apical deviation (0.76–2.50 mm), and angular deviation (2.11–7.14°)—were inconsistent. For example, these metrics were also reported using different statistical summaries (e.g., means with standard deviations or medians with interquartile ranges). Conclusions: The comparability and synthesis of evidence on CGSIS accuracy are significantly limited by non-standardized measurement approaches. The reported ranges of deviation values are a direct consequence of this methodological heterogeneity, not a comparison of implant system performance. Our findings highlight an urgent need for a consensus-based minimum reporting standard for future clinical research in this field to ensure reliable and translatable evidence. Full article
(This article belongs to the Special Issue New Trends in Digital Dentistry)
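As a reader aid, a small Python sketch of the three accuracy metrics the review maps (global coronal deviation, global apical deviation, and angular deviation between planned and achieved implant axes); the coordinates are invented for illustration and do not come from any included study.

import numpy as np

planned_coronal, achieved_coronal = np.array([0.0, 0.0, 0.0]), np.array([0.4, 0.3, 0.2])
planned_apical, achieved_apical = np.array([0.0, 0.0, 11.0]), np.array([0.9, 0.6, 11.3])

coronal_dev = np.linalg.norm(achieved_coronal - planned_coronal)   # mm
apical_dev = np.linalg.norm(achieved_apical - planned_apical)      # mm

axis_planned = planned_apical - planned_coronal
axis_achieved = achieved_apical - achieved_coronal
cos_angle = np.dot(axis_planned, axis_achieved) / (np.linalg.norm(axis_planned) * np.linalg.norm(axis_achieved))
angular_dev = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

print(f"coronal {coronal_dev:.2f} mm, apical {apical_dev:.2f} mm, angular {angular_dev:.2f} deg")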

25 pages, 3564 KB  
Systematic Review
IFC and Project Control: A Systematic Literature Review
by Davide Avogaro and Carlo Zanchetta
Buildings 2026, 16(1), 91; https://doi.org/10.3390/buildings16010091 - 25 Dec 2025
Viewed by 409
Abstract
Project control in cost estimation, time scheduling, and resource accounting remains challenging, particularly when using the open-source Industry Foundation Classes (IFCs) format. This study aims to define the state of the art in integrating these three domains. A systematic literature review was conducted, using a bibliometric analysis to map and interpret scientific knowledge and research trajectories, and an inductive analysis for a detailed examination of relevant studies. The analysis highlights a lack of clarity in applying the IFC standard across project control domains, as current practices often rely on non-standardized procedures, including incorrect use of classes or properties, creation of unneeded user-defined PropertySets and properties, or reliance on proprietary software. Integration of cost, time, and resource management remains limited, and proposed technological solutions generally require coding skills that typical professionals do not possess. Additional challenges include fragmented data across multiple databases, manual assignment of time, cost, and resource information, and limited collaboration, all of which are time-consuming and error-prone. There is a critical need for clearer guidelines on IFC usage to enable standardized procedures and facilitate the development of IFC-based tools. Automating these labor-intensive tasks could improve efficiency, reduce errors, and support broader adoption of integrated project control practices. Full article
(This article belongs to the Topic Application of Smart Technologies in Buildings)
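To make the PropertySet issue concrete, a hedged Python sketch that scans a STEP-format IFC file for IFCPROPERTYSET instances and flags names lacking the standard "Pset_" prefix as user-defined; "model.ifc" is a placeholder path, and a production workflow would use a real IFC parser (for example, IfcOpenShell) rather than a regular expression.

import re
from collections import Counter

# The Name attribute is the third field of an IFCPROPERTYSET instance in SPF files.
pattern = re.compile(r"IFCPROPERTYSET\s*\(\s*'[^']*'\s*,\s*[^,]+,\s*'([^']*)'", re.IGNORECASE)

with open("model.ifc", encoding="utf-8", errors="ignore") as fh:
    names = pattern.findall(fh.read())

user_defined = Counter(name for name in names if not name.startswith("Pset_"))
print(f"{len(names)} property sets found, {sum(user_defined.values())} user-defined")
for name, count in user_defined.most_common(10):
    print(f"  {name}: {count}")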

19 pages, 450 KB  
Article
Heuristics Analyses of Smart Contracts Bytecodes and Their Classifications
by Chibuzor Udokwu, Seyed Amid Moeinzadeh Mirhosseini and Stefan Craß
Electronics 2026, 15(1), 41; https://doi.org/10.3390/electronics15010041 - 22 Dec 2025
Viewed by 270
Abstract
Smart contracts are deployed and represented as bytecodes in blockchain networks, and these bytecodes are machine-readable codes. Only a small number of deployed smart contracts have their verified human-readable code publicly accessible to blockchain users. To improve the understandability of deployed smart contracts, we explored rule-based classification of smart contracts using iterative integration of fingerprints of relevant function interfaces and keywords. Our classification system included categories for standard contracts such as ERC20, ERC721, and ERC1155, and non-standard contracts like FinDApps, cross-chain, governance, and proxy. To do this, we first identified the core function fingerprints for all ERC token contracts. We then used an adapted header extractor tool to verify that these fingerprints occurred in all of the implemented functions within the bytecode. For the non-standard contracts, we took an iterative approach, identifying contract interfaces and relevant fingerprints for each specific category. To classify these contracts, we created a rule that required at least two occurrences of a relevant fingerprint keyword or interface. This rule was stricter for standard contracts: the 100% occurrence requirement ensures that we only identify compliant token contracts. For non-standard contracts, we required a minimum of two relevant fingerprint occurrences to prevent hash collisions and the unintentional use of keywords. After developing the classifier, we evaluated its performance on sample datasets. The classifier performed very well, achieving an F1 score of over 99% for standard contracts and a solid 93% for non-standard contracts. We also conducted a risk analysis to identify potential vulnerabilities that could reduce the classifier’s performance, including hash collisions, an incomplete rule set, manual verification bottlenecks, outdated data, and semantic misdirection or obfuscation of smart contract functions. To address these risks, we proposed several solutions: continuous monitoring, continuous data crawling, and extended rule refinement. The classifier’s modular design allows for these manual updates to be easily integrated. While semantic-based risks cannot be completely eliminated, symbolic execution can be used to verify the expected behavior of ERC token contract functions with a given set of inputs to identify malicious contracts. Lastly, we applied the classifier to contracts deployed on the Ethereum main network. Full article
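A compact Python sketch of the classification rule summarized in this abstract: a bytecode is labelled ERC20 only if all core selectors occur (the 100% rule), while a non-standard category needs at least two of its fingerprints. The ERC-20 entries are the well-known 4-byte function selectors; the non-standard fingerprint lists and the sample bytecode are hypothetical placeholders, not the authors' rule set.

ERC20_SELECTORS = {
    "18160ddd",  # totalSupply()
    "70a08231",  # balanceOf(address)
    "a9059cbb",  # transfer(address,uint256)
    "23b872dd",  # transferFrom(address,address,uint256)
    "095ea7b3",  # approve(address,uint256)
    "dd62ed3e",  # allowance(address,address)
}

def classify(bytecode_hex, category_fingerprints):
    code = bytecode_hex.lower()
    labels = []
    if all(sel in code for sel in ERC20_SELECTORS):        # strict 100% occurrence rule
        labels.append("ERC20")
    for category, fingerprints in category_fingerprints.items():
        if sum(fp in code for fp in fingerprints) >= 2:    # at least two relevant fingerprints
            labels.append(category)
    return labels or ["unclassified"]

# Hypothetical non-standard fingerprints, for illustration only.
categories = {"proxy": {"5c60da1b", "3659cfe6"}, "governance": {"56781388", "15373e3d"}}
sample = "6080604052 a9059cbb 70a08231 18160ddd 23b872dd 095ea7b3 dd62ed3e".replace(" ", "")
print(classify(sample, categories))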

54 pages, 2248 KB  
Systematic Review
Analysis Methods for Diagnosing Rare Neurodevelopmental Diseases with Episignatures: A Systematic Review of the Literature
by Albert Alegret-García, Alejandro Cáceres, Marta Sevilla-Porras, Luís A. Pérez-Jurado and Juan R. González
Biomedicines 2025, 13(12), 3043; https://doi.org/10.3390/biomedicines13123043 - 11 Dec 2025
Viewed by 966
Abstract
Background: Rare diseases (RDs) and neurodevelopmental disorders (NDDs) remain under-researched due to their low prevalence, leaving significant gaps in diagnostic strategies. Beyond next-generation sequencing, epigenetic profiling and particularly episignatures have emerged as a promising complementary diagnostic tool and for reclassifying variants of uncertain significance (VUS). However, clinical implementation remains limited, hindered by non-standardized methodologies and restricted data sharing that impede the development of sufficiently large datasets for robust episignature development. Methods: We conducted a systematic literature review following PRISMA 2020 guidelines to identify all studies reporting episignatures published between 2014 and 2025. The review summarizes methodological approaches used for episignature detection and implementation, as well as reports of epimutations. Results: A total of 108 studies met the inclusion criteria. All but three employed Illumina methylation arrays, mostly 450 K and EPIC versions for patient sample analysis. Three main methodological phases were identified: data quality control, episignature detection, and classification model training. Despite methodological variability across these stages, most studies demonstrated high predictive capabilities, often relying on methodologies developed by a small number of leading groups. Conclusions: Epigenetic screening has significant potential to improve diagnostic yield in RDs and NDDs. Continued methodological refinement and collaborative standardization efforts will be crucial for its successful integration into clinical practice. Nevertheless, key challenges persist, including the need for secure and ethical data-sharing frameworks, external validation, and methodological standardization. Full article
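For readers unfamiliar with the workflow, a hedged scikit-learn sketch of the last two phases named above (episignature detection as univariate probe selection, then classification model training); the methylation beta matrix is simulated, and real studies apply array-specific quality control and more elaborate feature selection.

import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_cases, n_controls, n_probes = 20, 40, 5000
X = rng.beta(2, 5, size=(n_cases + n_controls, n_probes))   # simulated beta values in [0, 1]
y = np.array([1] * n_cases + [0] * n_controls)
X[y == 1, :150] += 0.15                                     # simulated differentially methylated probes

# Probe selection inside the pipeline keeps cross-validation free of leakage.
model = make_pipeline(SelectKBest(f_classif, k=200), SVC(kernel="linear"))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f}")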

15 pages, 366 KB  
Article
Spanish Adaptation of the Career Commitment Scale: Psychometric Evidence and Associations with Stress and Health Across the Lifespan
by Tatiane Cristine Fröelich, Carmen Moret-Tatay and Manoela Ziebell de Oliveira
Healthcare 2025, 13(23), 3165; https://doi.org/10.3390/healthcare13233165 - 3 Dec 2025
Viewed by 370
Abstract
Introduction/Objectives: In the context of Spain’s persistently high job insecurity and evolving labor market, understanding how individuals sustain career engagement is critical. This study aimed to adapt and validate the Career Commitment Scale (CCS) for use in the Spanish population and examine its relationship with career adaptability, mental health, and stress across different age groups. Methods: Using a sample of 418 participants, exploratory and confirmatory factor analyses confirmed the CCS’s original three-factor structure, career identity, planning, and resilience, with satisfactory fit indices and strong reliability. Criterion-related validity was supported through significant positive correlations with career adaptability and negative associations with depression, anxiety, and stress. Test–retest analysis over a three-month interval showed moderate-to-strong temporal stability. Result: CFA confirmed the factor structure. A moderation analysis revealed that stress moderated the relationship between age and career resilience: older individuals demonstrated higher resilience under low stress conditions, but this benefit diminished under high stress exposure. Conclusions: These findings highlight the relevance of career commitment as a multidimensional construct closely linked to mental well-being and adaptive functioning in uncertain labor markets. The validated CCS provides a reliable tool for research and practice, offering new insights into how career motivation interacts with age and psychological stress across the lifespan. This validation has meaningful implications for organizational practices, career counseling, and public policy, as career commitment can buffer against Spain’s chronic unemployment and job precarity—particularly for younger workers and those in non-standard employment. Full article
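A brief statsmodels sketch of the moderation analysis reported here (career resilience regressed on age, stress, and their interaction, with predictors mean-centered); the data frame is simulated and the variable names are placeholders, not the CCS items.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 418
df = pd.DataFrame({"age": rng.integers(20, 65, n), "stress": rng.normal(0.0, 1.0, n)})
df["resilience"] = (3.0 + 0.02 * df["age"] - 0.3 * df["stress"]
                    - 0.015 * df["age"] * df["stress"] + rng.normal(0.0, 0.5, n))

# Mean-center the predictors before forming the interaction term.
df["age_c"] = df["age"] - df["age"].mean()
df["stress_c"] = df["stress"] - df["stress"].mean()

fit = smf.ols("resilience ~ age_c * stress_c", data=df).fit()
print(fit.summary().tables[1])   # the age_c:stress_c row is the moderation effect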

20 pages, 5967 KB  
Article
Investigation of the Structural, Mechanical and Operational Properties of an Alloy AlSi18Cu3CrMn
by Desislava Dimova, Boyan Dochev, Karel Trojan, Kalina Kamarska, Yavor Sofronov, Mihail Zagorski, Veselin Tsonev and Antonio Nikolov
Materials 2025, 18(23), 5434; https://doi.org/10.3390/ma18235434 - 2 Dec 2025
Viewed by 420
Abstract
A non-standardized hypereutectic aluminum–silicon alloy, AlSi18Cu3CrMn, was developed. To refine the structure of the studied composition, a phosphorus modifier was used in an amount of 0.04 wt %, and a complex modifying treatment was applied by combining the chemical elements of phosphorus, titanium, boron and beryllium (P, 0.04 wt %; Ti, 0.2 wt %; B, 0.04 wt %; Be, 0.007 wt %). To improve the mechanical and operational properties of the alloy, it was heat-treated (T6) at a temperature of 510–515 °C before quenching, with artificial aging applied at a temperature of 210 °C for 16 h. Phosphorus-modified alloy AlSi18Cu3CrMn was quenched in water at 20 °C, and the combined modified alloy was quenched in water at temperatures of 20 °C and 50 °C. By conducting a microstructural analysis, the free Si crystals and silicon crystals in the composition of the eutectic in the alloy structure were characterized, and by conducting XRD, the presence and type of secondary phases were established. The hardness of the alloy was measured, as well as the microhardness of the α-solid solution. Static uniaxial tensile testing was carried out at normal and elevated temperatures (working temperatures of 200 °C, 250 °C and 300 °C). By using a gravimetric method, the corrosion rate of the alloy in 1 M NaCl and 1 M H2SO4 was calculated. The mass wear, wear intensity and wear resistance of the studied AlSi18Cu3CrMn alloy were determined during reversible reciprocating motion in the boundary-layer lubrication regime. Full article
(This article belongs to the Special Issue High-Strength Lightweight Alloys: Innovations and Advancements)
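As a pointer to the gravimetric method mentioned in this abstract, a minimal Python sketch using the common ASTM G31-style mass-loss formula CR = K·W/(A·T·D); the mass loss, exposed area, exposure time and density below are invented example values, not the study's measurements.

K = 8.76e4      # constant giving a rate in mm/year
W = 0.0125      # mass loss, g (placeholder)
A = 6.0         # exposed area, cm^2 (placeholder)
T = 168.0       # exposure time, h (placeholder)
D = 2.68        # assumed density of the Al-Si alloy, g/cm^3

corrosion_rate = K * W / (A * T * D)
print(f"corrosion rate ≈ {corrosion_rate:.3f} mm/year")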

35 pages, 1766 KB  
Article
Design for Manufacturing and Assembly (DfMA) in Timber Construction: Advancing Energy Efficiency and Climate Neutrality in the Built Environment
by Michał Golański, Justyna Juchimiuk, Anna Podlasek and Agnieszka Starzyk
Energies 2025, 18(23), 6332; https://doi.org/10.3390/en18236332 - 2 Dec 2025
Cited by 1 | Viewed by 904
Abstract
The objective of this article is to evaluate the viability of implementing the Design for Manufacturing and Assembly (DfMA) methodology in the design and construction of complex wooden structures with non-standard geometry. The present study incorporates an analysis of scientific literature from 2011 to 2024, in addition to selected case studies of buildings constructed using glued laminated timber and engineered wood prefabrication technology. The selection of examples was based on a range of criteria, including geometric complexity, the level of integration of digital tools (BIM, CAM, parametric design), and the efficiency of assembly processes. The implementation of DfMA principles has been shown to reduce material waste by 15–25% and assembly time by approximately 30% when compared to traditional construction methods. The findings of the present study demonstrate that the concurrent integration of design, production, and assembly in the timber construction process enhances energy efficiency, curtails embodied carbon emissions, and fosters the adoption of circular economy principles. The analysis also reveals key implementation barriers, such as insufficient digital skills, lack of standardization, and limited availability of prefabrication facilities. The article places significant emphasis on the pivotal role of DfMA in facilitating the digital transformation of timber architecture and propelling sustainable construction development in the context of the circular economy. The conclusions indicate the need for further research on quantitative life cycle assessment (LCA, LCC) and on implementing DfMA at both national and international scales. Full article
(This article belongs to the Special Issue Energy Transition Towards Climate Neutrality)

34 pages, 1724 KB  
Review
Machine Learning for Photovoltaic Power Forecasting Integrated with Energy Storage Systems: A Scientometric Analysis, Systematic Review, and Meta-Analysis
by César Rodriguez-Aburto, Jorge Montaño-Pisfil, César Santos-Mejía, Pablo Morcillo-Valdivia, Roberto Solís-Farfán, José Curay-Tribeño, Alberto Morales-Vargas, Jesús Vara-Sanchez, Ricardo Gutierrez-Tirado, Abner Vigo-Roldán, Jose Vega-Ramos, Oswaldo Casazola-Cruz, Alex Pilco-Nuñez and Antonio Arroyo-Paz
Energies 2025, 18(23), 6291; https://doi.org/10.3390/en18236291 - 29 Nov 2025
Viewed by 1656
Abstract
Photovoltaic (PV) power forecasting combined with energy storage systems (ESS) is critical for grid stability and renewable energy optimization. Machine learning (ML) techniques have shown promise in improving PV forecast accuracy and ESS operation. However, the intersection of PV forecasting and ESS control remains underexplored, warranting a systematic review of recent advances and evaluation of ML effectiveness in PV–ESS integration. The objectives were to assess research trends in ML-based PV forecasting with ESS (scientometric analysis), to synthesize state-of-the-art ML approaches for PV–ESS forecasting (systematic review), and to quantify their overall predictive performance via a meta-analysis of the coefficient of determination (R²). A comprehensive search of Scopus (2010–2025) was conducted following PRISMA 2020 guidelines. Studies focusing on ML-based PV power forecasting integrated with ESS were included. Multiple reviewers screened the records and extracted data. Study quality was appraised using Joanna Briggs Institute checklists. A random-effects meta-analysis of R² was performed to aggregate model performance across studies. The search identified 227 records; 50 studies were included in the review and 5 in the meta-analysis. Publications grew rapidly after 2018, indicating increased interest in PV–ESS forecasting. Deep learning models and hybrid architectures were the most frequently studied and outperformed traditional methods, while integrating PV forecasts with ESS control consistently improved operational outcomes. Common methodological limitations were noted, such as limited external validation and non-standardized evaluation metrics. The meta-analysis found a pooled R² of approximately 0.95 (95% CI) with no heterogeneity (I² = 0), and no evidence of publication bias. ML-based forecasting significantly improves PV–ESS performance, underscoring AI as a key enabler for effective PV–ESS integration. Future research should address remaining gaps and explore advanced approaches to further enhance PV–ESS outcomes. Full article
(This article belongs to the Topic Solar Forecasting and Smart Photovoltaic Systems)
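To illustrate the pooling step, a hedged Python sketch of a DerSimonian–Laird random-effects meta-analysis of per-study R² values; the effect sizes and within-study variances are placeholders, and pooling R² directly (rather than a transformed effect size) is a simplification.

import numpy as np

r2 = np.array([0.96, 0.94, 0.95, 0.93, 0.97])               # per-study R^2 (placeholders)
var = np.array([0.0004, 0.0006, 0.0005, 0.0007, 0.0003])    # within-study variances (placeholders)

w = 1.0 / var
fixed = np.sum(w * r2) / np.sum(w)
k = len(r2)
Q = np.sum(w * (r2 - fixed) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # between-study variance
I2 = max(0.0, (Q - (k - 1)) / Q) * 100.0 if Q > 0 else 0.0

w_star = 1.0 / (var + tau2)
pooled = np.sum(w_star * r2) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled R^2 = {pooled:.3f} (95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f}), "
      f"tau^2 = {tau2:.5f}, I^2 = {I2:.1f}%")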

31 pages, 4232 KB  
Systematic Review
Artificial Intelligence-Driven SELEX Design of Aptamer Panels for Urinary Multi-Biomarker Detection in Prostate Cancer: A Systematic and Bibliometric Review
by Ayoub Slalmi, Nabila Rabbah, Ilham Battas, Ikram Debbarh, Hicham Medromi and Abdelmjid Abourriche
Biomedicines 2025, 13(12), 2877; https://doi.org/10.3390/biomedicines13122877 - 25 Nov 2025
Viewed by 1114
Abstract
Background/Objectives: The limited specificity of prostate-specific antigen (PSA) drives unnecessary biopsies in prostate cancer (PCa). Urinary extracellular vesicles (uEVs) provide a non-invasive reservoir of tumor-derived nucleic acids and proteins. Aptamers selected by SELEX enable highly specific capture, and artificial intelligence (AI) can accelerate their optimization. This systematic review evaluated AI-assisted SELEX for urine-derived and exosome-enriched aptamer panels in PCa detection. Methods: Systematic searches of PubMed, Scopus, and Web of Science (1 January 2010–24 August 2025; no language restrictions) followed PRISMA 2020 and PRISMA-S. The protocol is registered on OSF (osf.io/b2y7u). After deduplication, 1348 records were screened; 129 studies met the eligibility criteria, including 34 (26.4%) integrating AI within SELEX or downstream refinement. Inclusion required at least one quantitative metric (dissociation constant Kd, SELEX cycles, limit of detection [LoD], sensitivity, specificity, or AUC). Risk of bias was appraised with QUADAS-2 (diagnostic accuracy studies) and PROBAST (prediction/machine learning models). Results: AI-assisted SELEX workflows reduced laboratory enrichment cycles from conventional 12–15 to 5–7 (≈40–55% relative reduction) and reported Kd values spanning low picomolar to upper nanomolar ranges; heterogeneity and inconsistent comparators precluded pooled estimates. Multiplex urinary panels (e.g., PCA3, TMPRSS2:ERG, miR-21, miR-375, EN2) yielded single-study AUCs between 0.70 and 0.92 with sensitivities up to 95% and specificities up to 88%; incomplete 2 × 2 contingency reporting prevented bivariate meta-analysis. LoD reporting was sparse and non-standardized despite several ultralow claims (attomolar to low femtomolar) on nanomaterial-enhanced platforms. Pre-analytical variability and absent threshold prespecification contributed to high or unclear risk (QUADAS-2). PROBAST frequently indicated high risk in participants and analysis domains. Across the included studies, lower Kd and reduced LoD improved analytical detectability; however, clinical specificity and AUC were predominantly shaped by pre-analytical control (matrix; post-DRE vs. spontaneous urine) and prespecified thresholds, so engineering gains did not consistently translate into higher diagnostic accuracy. Conclusions: AI-assisted SELEX is a promising strategy for accelerating high-affinity aptamer discovery and assembling multiplex urinary panels for PCa, but current evidence is early phase, heterogeneous, and largely single-center. Priorities include standardized uEV processing, complete 2 × 2 diagnostic reporting, multicenter external validation, calibration and decision impact analyses, and harmonized LoD and Kd reporting frameworks. Full article
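Because the review flags incomplete 2 × 2 reporting, a short Python reminder of the quantities a complete contingency table yields; the counts are invented for illustration.

TP, FP, FN, TN = 57, 9, 3, 66    # hypothetical counts for one urinary panel

sensitivity = TP / (TP + FN)
specificity = TN / (TN + FP)
ppv = TP / (TP + FP)
npv = TN / (TN + FN)
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")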

27 pages, 643 KB  
Article
Fractional Modeling and Stability Analysis of Tomato Yellow Leaf Curl Virus Disease: Insights for Sustainable Crop Protection
by Mansoor Alsulami, Ali Raza, Marek Lampart, Umar Shafique and Eman Ghareeb Rezk
Fractal Fract. 2025, 9(12), 754; https://doi.org/10.3390/fractalfract9120754 - 21 Nov 2025
Viewed by 509
Abstract
Tomato Yellow Leaf Curl Virus (TYLCV) has recently caused severe economic losses in global tomato production. According to the International Plant Protection Convention (IPPC), yield reductions of 50–60% have been reported in several regions, including the Caribbean, Central America, and South Asia, with losses in sensitive cultivars reaching up to 90–100%. In developing countries, TYLCV and mixed infections affect more than seven million hectares of tomato-growing land annually. In this study, we construct and analyze a nonlinear dynamic model describing the transmission of TYLCV, incorporating the Caputo fractional-order derivative operator. The existence and uniqueness of solutions to the proposed model are rigorously established. Equilibrium points are identified, and the Jacobian determinant approach is applied to compute the basic reproduction number, R0. Suitable Lyapunov functions are formulated to analyze the global asymptotic stability of both the disease-free and endemic equilibria. The model is numerically solved using the Grünwald–Letnikov-based nonstandard finite difference method, and simulations assess how the memory index and preventive strategies influence disease propagation. The results reveal critical factors governing TYLCV transmission and suggest effective intervention measures to guide sustainable crop protection policies. Full article
(This article belongs to the Special Issue Applications of Fractional Calculus in Modern Mathematical Modeling)
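For intuition about the numerical scheme named in this abstract, a hedged Python sketch of a Grünwald–Letnikov-based nonstandard finite difference step applied to a toy scalar fractional logistic equation D^α y = r·y·(1 − y/K), not the authors' TYLCV compartmental model; the denominator function φ(h) = 1 − e^(−h) is one common NSFD choice, and all parameters are illustrative.

import numpy as np

alpha, r, K = 0.9, 0.4, 1.0
h, steps = 0.1, 300
phi = 1.0 - np.exp(-h)                     # nonstandard denominator function

# Grunwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j), by recurrence.
c = np.empty(steps + 1)
c[0] = 1.0
for j in range(1, steps + 1):
    c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)

y = np.empty(steps + 1)
y[0] = 0.05
for n in range(1, steps + 1):
    memory = np.dot(c[1:n + 1], y[n - 1::-1] - y[0])    # sum_{j=1..n} c_j (y_{n-j} - y_0)
    f = r * y[n - 1] * (1.0 - y[n - 1] / K)             # explicit right-hand side
    y[n] = y[0] + phi**alpha * f - memory

print(f"y at t = {steps * h:.0f}: {y[-1]:.4f}")          # approaches the carrying capacity K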

19 pages, 7609 KB  
Review
Mine Water Production, Treatment, and Utilization in the Yellow River Basin: Spatial Patterns and Sustainable Transformation Pathways
by Wenjie Li, Hao Xie, Wenjie Sun, Yunchun Han, Xiaodong Jiang, Gang Huang and Pengfei Tao
Appl. Sci. 2025, 15(23), 12353; https://doi.org/10.3390/app152312353 - 21 Nov 2025
Viewed by 499
Abstract
The Yellow River Basin faces high-intensity coal resource development and severe water scarcity. This makes the treatment and use of mine water a critical factor constraining both coal industry development and ecological security for the region. This study uses kernel density estimation and the Standard Deviational Ellipse model to identify the spatial pattern of mine water production. It also combines bibliometric analysis and field investigations to assess research progress and current practice for mine water treatment and use in the basin. Results show that mine water production displays strong spatial clustering, with the center of gravity shifting northward. Research is moving from an engineering-focused stage to a theory-oriented one, emphasizing systematic optimization and sustainable use. Current practices still struggle with non-standardized data, uneven treatment quality, and incomplete management systems. This research underscores the importance of improving the region’s integrated management of mine water and proposes shifting mine water from an environmental burden to a resource asset. Full article
(This article belongs to the Special Issue Hydrogeology and Regional Groundwater Flow)
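As a companion to the spatial methods named here, a hedged Python sketch combining a Gaussian kernel density estimate with a covariance-based directional ellipse (a simple stand-in for the classical Standard Deviational Ellipse); the coordinates are simulated placeholders, not the study's mine-water data.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
xy = rng.normal(loc=[110.0, 38.0], scale=[1.5, 0.8], size=(200, 2))  # placeholder lon/lat of mines

kde = gaussian_kde(xy.T)                       # kernel density estimate over the sites
centre = xy.mean(axis=0)                       # mean centre of the distribution
print(f"density at the mean centre: {kde(centre)[0]:.4f}")

centered = xy - centre
eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
orientation = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))  # major-axis direction from east
axes = np.sqrt(eigvals)                        # 1-sd semi-axis lengths (minor, major)
print(f"centre {centre.round(3)}, major-axis orientation {orientation:.1f} deg, semi-axes {axes.round(3)}")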

31 pages, 1718 KB  
Article
A Comparative Techno-Economic Analysis of Waste Cooking Oils and Chlorella Microalgae for Sustainable Biodiesel Production
by Ahmed A. Bhran
Processes 2025, 13(11), 3526; https://doi.org/10.3390/pr13113526 - 3 Nov 2025
Cited by 1 | Viewed by 1778
Abstract
This research work presents a techno-economic assessment of biodiesel production with non-standard waste cooking oil (WCO) (brown grease of small restaurants, yellow grease of households) and semi-open Chlorella sp. microalgal cultivation, which addresses the problematic areas of scale and cost-efficiency in sustainable biodiesel production. Cost-effective biodiesel feedstock research has been motivated by the urgency of finding sustainable sources of energy. With base-catalyzed transesterification optimized by ANOVA and response surface methodology (RSM), the present study recorded biodiesel yields of up to 99.08% in household WCO (at optimum conditions: 55 °C, 3.3 mg/g NaOH, ethanol) and 96.61% in restaurant WCO (at optimum conditions: 54 °C, 1.5 mg/g NaOH, methanol) compared to 28.6% in Chlorella sp. (semi-open photobioreactors). For the two WCO feedstocks, the fitted equations predict biodiesel viscosity and yield as functions of temperature and the catalyst-to-oil/alcohol ratio, in good agreement with the experimental values. The household WCO gave better yield and quality because it contains fewer impurities, whereas the restaurant WCO required further purification, driving up its cost. Although Chlorella biodiesel is carbon neutral, its production and extraction costs are higher, making it less economically feasible for biodiesel production. Economic analysis showed that the capital costs of household WCO, restaurant WCO, and Chlorella sp. are USD 190,000, USD 220,000, and USD 720,000, respectively, based on a biodiesel production rate of 1,000,000 L/year. Low capital costs and byproduct glycerol income give the two investigated types of WCO short payback periods (0.23–0.91 years) and high ROI (110–444.4%). The analysis highlights the economic and environmental benefits of WCO, especially household WCO, as a scalable biodiesel feedstock, which provides new insights into process optimization and sustainable biodiesel strategies. To enhance sustainability and cost-effectiveness and to contribute to the global transition to renewable biofuels, future studies should focus on reducing the energy demand of microalgae production and on the purification of restaurant WCO. Full article
(This article belongs to the Section Environmental and Green Processes)
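For context on the economic figures, a tiny Python sketch of the simple payback and ROI arithmetic used in such comparisons; the capital costs repeat the abstract's numbers, but the annual net cash flows are invented placeholders, so the computed payback and ROI are illustrative only.

scenarios = {
    "household WCO": {"capital": 190_000, "annual_net_cash_flow": 620_000},
    "restaurant WCO": {"capital": 220_000, "annual_net_cash_flow": 300_000},
    "Chlorella sp.": {"capital": 720_000, "annual_net_cash_flow": 90_000},
}

for name, s in scenarios.items():
    payback_years = s["capital"] / s["annual_net_cash_flow"]
    simple_roi = 100.0 * s["annual_net_cash_flow"] / s["capital"]    # percent per year
    print(f"{name}: payback ≈ {payback_years:.2f} years, simple ROI ≈ {simple_roi:.0f}%/year")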

35 pages, 1857 KB  
Review
Antioxidant Activity and Oxidative Stability of Flaxseed and Its Processed Products: A Review
by Yuliya Frolova, Roman Sobolev and Alla Kochetkova
Sci 2025, 7(4), 155; https://doi.org/10.3390/sci7040155 - 2 Nov 2025
Cited by 1 | Viewed by 2324
Abstract
Flaxseed (Linum usitatissimum) is one of the most important crops worldwide due to its nutritional and functional properties. Given the diversity of flax and its processed products, this review aimed to systematize and analyze data on their antioxidant properties, oxidative stability, and content of biologically active substances. The literature search was conducted using the following databases: Scopus and The Lens. This review examines the approaches to studying the antioxidant properties, oxidative stability, and content of biologically active substances of flax and its processed products, which are used in the food industry, highlighting the advantages and limitations of the methods employed. For the analysis of antioxidant activity (AOA) and oxidative stability (OS) in flaxseeds and their processing products, the most common approach is the in vitro model. For AOA assessment, non-standardized methods such as DPPH, FRAP, and ABTS•+ are most frequently used, while standard methods for determining OS (PV, AV, p-AnV, CDs, CTs, TBARSs, OSI) are employed. However, these parameters are integral and cannot fully explain the underlying processes. In our opinion, the most promising directions for further research include the standardization of methods for analyzing the AOA of flaxseed and its processing products. Furthermore, expanding the methodological framework will lead to a better understanding of the mechanisms of oxidative processes and how to inhibit them. An expanded set of AOA assessment methods will allow researchers not only to study the action of antioxidants but also to predict it. This is particularly relevant since the same antioxidant can exhibit both antioxidant and pro-oxidant effects. Full article
(This article belongs to the Section Chemistry Science)
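To anchor the assay names, a minimal Python sketch of the DPPH arithmetic: percent radical inhibition from absorbance readings and an IC50 estimate by linear interpolation; the absorbance values and concentrations are invented for illustration.

import numpy as np

a_control = 0.820                                     # absorbance of the DPPH blank
conc = np.array([10, 25, 50, 100, 200])               # extract concentration, ug/mL
a_sample = np.array([0.74, 0.63, 0.49, 0.33, 0.18])   # absorbance with extract added

inhibition = (a_control - a_sample) / a_control * 100.0
ic50 = np.interp(50.0, inhibition, conc)              # inhibition increases with concentration
print("inhibition (%):", inhibition.round(1))
print(f"IC50 ≈ {ic50:.0f} ug/mL")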
