Search Results (779)

Search Parameters:
Keywords = analytical procedure validation

31 pages, 834 KB  
Article
Verification of the Methods of Digital Monitoring of Information Space Based on Coding Theory Tools
by Dina Shaltykova, Akhat Bakirov, Anastasiya Grishina, Mariya Kostsova, Yelizaveta Vitulyova and Ibragim Suleimenov
Computers 2026, 15(4), 260; https://doi.org/10.3390/computers15040260 - 21 Apr 2026
Abstract
This study examines the applicability of coding-theoretic tools to the digital monitoring of information space. The proposed approach treats response patterns to socially significant stimuli as binary sequences and interprets their analysis as a classification problem analogous to error correction in coding theory. To verify the feasibility of this framework, a model psychological test consisting of seven binary questions was analyzed using a procedure derived from the Hamming code (7,4). The method makes it possible to map the full space of observed answer combinations onto a smaller set of reference codewords and thereby identify stable response configurations. The obtained results show that the distributions produced after coding-based transformation are markedly non-uniform and contain recurrent maxima, indicating the presence of structured patterns in collective responses. It is also shown that permutations of question order substantially affect the resulting distributions and correlation indicators, which highlights both the sensitivity and the analytical potential of the proposed encoding scheme. The main contribution of the study is methodological: it demonstrates that error-correcting coding can be operationalized as a formal tool for detecting latent regularities in simplified monitoring data. At the same time, the present results should be regarded as proof of concept, since further work is required to validate the approach on larger datasets, compare it with baseline classification methods, and extend it to longer and multivalued response sequences. Full article
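The codeword-mapping step this abstract describes can be illustrated in a few lines: every 7-bit response pattern is collapsed onto the nearest of the 16 reference codewords of the (7,4) Hamming code. The generator matrix below is one standard systematic convention, not necessarily the one the authors used, and the code is an illustrative sketch rather than their implementation:

```python
import itertools

# Systematic generator matrix for the (7,4) Hamming code (one common convention):
# first four bits are the message, last three are parity checks.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(bits4):
    # Multiply the 4-bit message by G over GF(2): one output bit per column.
    return tuple(sum(b * g for b, g in zip(bits4, col)) % 2 for col in zip(*G))

CODEWORDS = [encode(m) for m in itertools.product([0, 1], repeat=4)]

def nearest_codeword(response7):
    # Collapse a 7-bit answer pattern onto the closest of the 16 reference
    # codewords (minimum Hamming distance); single-bit deviations are
    # corrected uniquely because the code has minimum distance 3.
    return min(CODEWORDS, key=lambda c: sum(a != b for a, b in zip(c, response7)))
```

Tallying `nearest_codeword` over all observed response patterns yields the reduced distribution over 16 reference configurations whose non-uniformity the study analyzes.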

17 pages, 2395 KB  
Article
Chromatographic Determination and Antimicrobial Evaluation of Walnut (Juglans regia L.) Septa from Different Habitats
by Jurgita Luksiene, Nerija Zevzikovaite, Jurga Andreja Kazlauskaite, Mindaugas Marksa, Agne Giedraitiene, Lina Merkeviciene, Asta Kubiliene and Andrejus Zevzikovas
Plants 2026, 15(8), 1263; https://doi.org/10.3390/plants15081263 - 20 Apr 2026
Abstract
Walnut septum (WS), a major by-product of walnut processing, represents a promising source of bioactive compounds with potential antioxidant and antimicrobial properties. This study aimed to characterise the phytochemical composition of WS extracts from different habitat origins and evaluate their antimicrobial activity. Total amino acids were profiled by gas chromatography–mass spectrometry, while phenolic compounds were analysed using high-performance liquid chromatography. Both methods were evaluated according to ICH Q2 (R2) guidelines for analytical procedure validation. The results showed a complex composition of amino acids and polyphenols, including ellagic acid and quercitrin. However, it was clear that habitat variations in WS samples had a significant impact on the quantities and composition of phenolic compounds and total amino acids in WS extracts. Antimicrobial activity was assessed against Gram-positive and Gram-negative bacterial strains. Variations in antimicrobial efficacy were associated with differences in phenolic composition and content due to habitat differences in WS sample origins. Collectively, this study highlights the WS as a valuable agro-industrial by-product with potential applications as a natural source of antimicrobial compounds in food and pharmaceutical systems. Full article

24 pages, 672 KB  
Systematic Review
Bloodstain Pattern Analysis in Crime Scene Investigation: A Systematic Literature Review
by Muhammad Jefri Mohd Yusof, Tharshini Chandran, Muhammad Reza Amin Reza Adnan, Eddy Saputra Rohmatul Amin, Sarah Aliah Amir Sarifudin and Nurul Ain Abu Bakar
Forensic Sci. 2026, 6(2), 38; https://doi.org/10.3390/forensicsci6020038 - 20 Apr 2026
Abstract
Background/Objectives: Bloodstain pattern analysis (BPA) is widely used in crime scene investigation (CSI), yet its practical application, evidential limits, and interpretive role are often discussed in fragmented or technique-focused terms. This systematic literature review examines how BPA is used in CSI, with emphasis on its operational functions, interpretive scope, and scientific robustness. Methods: The review followed PRISMA 2020 guidelines. A comprehensive search was conducted in Scopus using predefined Boolean strings. After screening, eligibility assessment, and manual review, 18 peer-reviewed research articles published between 1996 and 2026 were included. Data were extracted systematically and analysed using thematic synthesis. Results: The findings show that BPA is applied in CSI as an integrated evidential pathway rather than as a single analytical procedure. Its uses include bloodstain detection and documentation, geometric reconstruction through trajectory and area-of-origin analysis, differentiation of mechanisms and sources to prevent misclassification, activity-level inference based on transfer and contact phenomena, and temporal reasoning related to trace formation. The review also highlights the role of validation infrastructures, including blood substitutes, animal analogues, and computational methods, which support training, experimentation, and reproducibility under ethical and practical constraints. Across the literature, reconstruction accuracy is shown to be sensitive to documentation quality, measurement assumptions, environmental conditions, and contextual limitations. Conclusions: Overall, BPA contributes to CSI by enabling structured, context-aware interpretation of blood evidence while remaining subject to measurement assumptions, contextual influences, and cognitive factors that may affect reconstruction outcomes. 
Its evidential value lies not only in reconstructing events, but also in supporting transparent, testable, and defensible forensic reasoning. Full article

20 pages, 400 KB  
Article
Transforming FHIR into an OWL Knowledge Graph for Schema-Grounded Natural-Language Querying and Exploratory Data Analysis
by Steve K. Platt, Daniel B. Hier, Borchuluun Yadamsuren, Anh N. Nguyen and Vaughn Hartzell
Appl. Sci. 2026, 16(8), 3936; https://doi.org/10.3390/app16083936 - 18 Apr 2026
Abstract
FHIR was designed for transactional interoperability but is less well suited to querying and exploratory analysis because its resource-centric structure distributes meaning across deeply nested resources. To address this limitation, we transformed MIMIC-IV Demo FHIR data into an OWL-compliant knowledge graph by flattening nested elements, normalizing repeating arrays, resolving inter-resource references, and promoting frequently queried attributes to direct properties. We also aligned diagnosis and procedure codes to ICD-9-CM and ICD-10-CM terminologies and developed a schema-grounded NL2SPARQL interface for natural-language querying. Structural validation was performed with SHACL and OWL reasoning. Across a curated evaluation set, NL2SPARQL achieved a mean accuracy exceeding 95% relative to expert-authored queries. These results suggest that ontologizing FHIR can improve analytic accessibility while preserving clinically meaningful assertions. Full article
(This article belongs to the Special Issue Exploring Semantic Technologies and Their Application)

16 pages, 2218 KB  
Article
Investigating the Correlation Between Front and Rear Roll Center Heights to Achieve Neutral Handling: An Iterative Design Approach Based on Experimental Tire Data
by Mădălina Boțu, Gabriel George Ursescu, Ciprian Dumitru Ciofu and Edward Rakosi
Vehicles 2026, 8(4), 92; https://doi.org/10.3390/vehicles8040092 - 17 Apr 2026
Abstract
This paper presents an iterative graph-analytical procedure for determining the roll center height, one of the most critical design parameters influencing vehicle dynamic behavior during cornering. The conventional approaches generally determine roll center locations from suspension kinematics and then evaluate vehicle behavior using multibody or numerical vehicle dynamics models. By contrast, the proposed method is intended for the preliminary design stage and provides a direct correlation between front and rear target roll center heights using tire test data, load transfer and axle-level equilibrium conditions. The main advantage of the method is that it helps define a feasible design space before detailed geometry optimization or MBD validation is performed. The objective is to achieve stable and neutral handling (avoiding intrinsic understeer or oversteer tendencies) during steady-state cornering at a predefined target lateral acceleration. The methodology integrates (i) lateral force equilibrium at the axle level, (ii) a dynamic load transfer model based on axle roll stiffness and roll center heights, and (iii) experimental tire grip characteristics (lateral force–slip angle curves under varying vertical loads), processed through numerical interpolation. The procedure is demonstrated using a vehicle model with specific geometric and mass parameters. The results indicate that the methodology does not yield a single unique solution, but rather a set of correlated roll center heights, allowing the designer to select the most feasible geometric configuration while maintaining neutral handling. As an example, the paper presents a convergent solution for the front and rear roll center heights that satisfy neutrality conditions at a slip angle of approximately 4°. This study provides a fundamental framework for the geometric design of suspension systems and serves as a basis for subsequent numerical and experimental validation. Full article
(This article belongs to the Special Issue Vehicle Design Processes, 3rd Edition)

15 pages, 961 KB  
Article
Minimally Invasive Therapeutic Drug Monitoring of Immunosuppressants in Children with Kidney Diseases: Validation of Fingerstick Sampling Using LC-MS/MS
by Marika Ishii, Jun Aoyagi, Natsuka Kimura, Masanori Kurosaki, Tomomi Maru, Kazuya Tanimoto, Mitsuaki Yoshino, Takane Ito, Takahiro Kanai, Hitoshi Osaka, Ryozo Nagai and Kenichi Aizawa
Pharmaceuticals 2026, 19(4), 630; https://doi.org/10.3390/ph19040630 - 16 Apr 2026
Abstract
Background/Objectives: Therapeutic drug monitoring (TDM) of immunosuppressants is essential in treating pediatric kidney diseases; however, repeated venipuncture is burdensome in children. We evaluated whether minimally invasive fingerstick capillary sampling combined with liquid chromatography–tandem mass spectrometry (LC-MS/MS) provides results analytically comparable to those of conventional venous sampling. Methods: Capillary whole blood (2.8 µL) was collected via fingersticks from pediatric patients receiving mycophenolate mofetil, with or without tacrolimus (TAC) or cyclosporine A (CsA). Drug concentrations were quantified using a previously validated simultaneous LC-MS/MS method and compared with conventional venous sampling using linear regression and Bland–Altman analyses. Results: Seventy-four paired samples from 21 patients were analyzed. Strong correlations were observed between capillary and venous samples for mycophenolic acid (MPA), TAC, and CsA (R2 > 0.90). Hematocrit correction improved agreement for MPA. Bland–Altman analyses demonstrated acceptable bias across analytes. Conclusions: Fingerstick-based microvolume sampling combined with LC-MS/MS provides analytically reliable immunosuppressant quantification in pediatric patients. Although larger clinical validation is required, this minimally invasive approach may reduce procedural burden and may support future outpatient or home-based TDM strategies. Full article
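The capillary–venous comparison this abstract reports rests on a Bland–Altman analysis: the mean of the paired differences gives the bias, and bias ± 1.96 standard deviations gives the 95% limits of agreement. A minimal sketch with hypothetical concentration values (not the study's data or code):

```python
import statistics

def bland_altman(pairs):
    """Mean bias and 95% limits of agreement for paired measurements."""
    diffs = [a - b for a, b in pairs]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)            # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical capillary vs. venous tacrolimus concentrations (ng/mL),
# illustrative values only.
pairs = [(5.1, 5.0), (7.8, 8.0), (3.2, 3.1), (6.5, 6.4), (4.9, 5.2)]
bias, (lower, upper) = bland_altman(pairs)
```

In practice the differences are also plotted against the pairwise means to check whether bias varies with concentration, which is where a hematocrit correction of the kind the authors describe would show up.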
20 pages, 1287 KB  
Systematic Review
Neuromodulatory Interventions in Experimental Acute Pancreatitis: A Systematic Review of Rodent Studies
by Maxim Rantsev, Alexey Sarapultsev and Valeriy Chereshnev
Diseases 2026, 14(4), 145; https://doi.org/10.3390/diseases14040145 - 16 Apr 2026
Abstract
Background/Objectives: Acute pancreatitis (AP) lacks disease-modifying pharmacotherapy. Neuroimmune, serotonergic, and redox-regulated pathways may modulate inflammatory amplification and acinar injury, although pharmacovigilance data link some psychotropic drug classes to AP risk. This review synthesized controlled rodent studies evaluating neuromodulatory interventions with serotonergic, stress-axis, or ferroptosis-linked targets in experimental AP. Methods: PubMed, Scopus, eLIBRARY.ru, and Elicit were searched in January 2026, supplemented by Google Scholar audit and citation chasing. Eligible studies were controlled in vivo rodent experiments using validated AP models with quantitative outcomes. Intervention timing was classified a priori as a primary analytic variable. Risk of bias was assessed with SYRCLE. A prespecified audit showed that no subset met the criteria for quantitative pooling because of heterogeneity in model class, compounds, timing, outcome definitions, units, and sampling timepoints. Mechanism-stratified qualitative synthesis was therefore performed. The protocol was registered on OSF (doi: 10.17605/OSF.IO/CZXDJ). Results: Nine studies (1992–2023) yielded 410 outcome rows across three mechanistic strands. Serotonergic modulation (5-HT2/5-HT2A-focused; six studies) reduced serum amylase/lipase (−37% to −65% vs. disease controls) and histological injury, with receptor-selectivity data supporting 5-HT2A-mediated mechanisms. Stress-axis modulation with thiadiazine L-17 reduced 7-day mortality in two severe models (from 50–70% to 30%). Olanzapine attenuated ferroptosis-linked injury via off-target antioxidant activity independent of serotonergic receptors. All interventions were prophylactic, peri-induction, or very early post-induction; no delayed therapeutic-window studies were identified. Most SYRCLE domains were unclear, particularly allocation concealment and blinding-related procedures. 
Conclusions: Neuromodulatory pathways modulate experimental AP in rodents, but evidentiary strength differs across mechanistic strands. Inference is constrained by absent therapeutic-window testing, heterogeneous endpoints, and reporting deficits. The findings support mechanism-level target prioritization rather than clinical repurposing. Full article
(This article belongs to the Special Issue Diseases: From Molecular to the Clinical Perspectives)

24 pages, 527 KB  
Article
A Human–AI Collaborative Pipeline for Decision Support in Urban Development Projects Based on Large-Scale Social Media Text Analysis
by Alexander A. Kharlamov and Maria Pilgun
Technologies 2026, 14(4), 228; https://doi.org/10.3390/technologies14040228 - 14 Apr 2026
Abstract
The rapid growth of digital communication platforms has generated vast volumes of user-generated textual data and digital footprints, creating growing demand for scalable artificial intelligence systems capable of supporting evidence-based decision-making. This study proposes and evaluates a human–AI collaborative analytical pipeline for multi-class sentiment and aggression analysis of large-scale social media data (N = 15,064 messages) related to an urban infrastructure project. The proposed framework integrates standard NLP preprocessing, machine learning-based classifiers, temporal aggregation, and controlled large language model (LLM)-assisted classification within a structured analytical workflow that incorporates expert validation and oversight. A stratified manual validation procedure (n = 301) demonstrated substantial inter-annotator agreement (κ = 0.70) and stable multi-class classification accuracy (80%). The results indicate that combining sentiment polarity and aggression detection as complementary linguistic indicators improves sensitivity to shifts in discourse dynamics and enables early identification of emerging social tension. The study demonstrates the potential of human–AI collaborative analytical frameworks for transparent, interpretable, and predictive large-scale social media analysis in decision-support contexts. Full article
(This article belongs to the Special Issue Human–AI Collaboration: Emerging Technologies and Applications)
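The inter-annotator agreement statistic this abstract reports (Cohen's κ = 0.70) compares observed agreement against agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). A minimal sketch with hypothetical sentiment labels, not the study's annotations:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' label sequences of equal length."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)    # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical annotator labels for four messages (illustrative only):
rater1 = ["pos", "neg", "pos", "neg"]
rater2 = ["pos", "neg", "neg", "neg"]
kappa = cohens_kappa(rater1, rater2)
```

Values around 0.61–0.80 are conventionally read as substantial agreement, which matches how the abstract characterizes its κ = 0.70.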

19 pages, 1177 KB  
Review
Imaging Engineering and Artificial Intelligence in Urinary Stone Disease: Low-Dose Computed Tomography, Spectral Technologies, and Predictive Models
by Shota Iijima, Takanobu Utsumi, Rino Ikeda, Naoki Ishitsuka, Takahide Noro, Yuta Suzuki, Yuka Sugizaki, Takatoshi Somoto, Ryo Oka, Takumi Endo, Naoto Kamiya and Hiroyoshi Suzuki
Eng 2026, 7(4), 174; https://doi.org/10.3390/eng7040174 - 11 Apr 2026
Abstract
Urinary stone disease is common, recurrent, and increasingly managed through imaging-driven pathways, yet standard-dose CT of the kidneys, ureters, and bladder (CT KUB) raises concerns about cumulative radiation exposure and the limited use of quantitative imaging information for risk stratification. This review synthesizes contemporary evidence on dose-optimized CT, advanced spectral technologies, and artificial intelligence (AI)-enabled analytics that are reshaping diagnosis, treatment selection, and triage. This review summarizes data supporting low-dose and ultra-low-dose CT protocols that preserve diagnostic accuracy while substantially reducing dose, and discusses how dual-energy CT, photon-counting CT, and radiomics facilitate noninvasive stone characterization and extraction of imaging biomarkers beyond size and location. It also reviews AI approaches for automated detection, segmentation, and volumetric quantification across CT, KUB, and ultrasounds, highlighting their potential to standardize stone-burden metrics. It further examines predictive models, including logistic regression, nomograms, and machine learning, for perioperative infectious complications, emergency department admission or intervention, procedure success, and long-term recurrence, and outlines reporting and validation frameworks and implementation considerations, including software as a medical device regulation and human oversight. In contrast to prior reviews that consider imaging and AI separately, this review integrates dose reduction, spectral characterization, and AI-driven analytics within real-world clinical pathways to distinguish established clinical applications from those that remain investigational. Integrating advanced CT and AI outputs into well-validated prediction models embedded in real-world workflows may enable safer imaging, more consistent triage, and more personalized follow-up for urinary stone disease. Full article

22 pages, 3732 KB  
Systematic Review
Mapping Urban Socio-Economic Resilience to Climate Change: A Bibliometric Systematic Review and Thematic Analysis of Global Research (1990–2025)
by Irina Onțel, Luminița Chivu, Sorin Avram and Carmen Gheorghe
Sustainability 2026, 18(8), 3698; https://doi.org/10.3390/su18083698 - 9 Apr 2026
Abstract
Urban socio-economic resilience to climate change has emerged as a central research theme as cities increasingly confront interconnected environmental, economic, and social risks. Despite the rapidly expanding body of literature, the conceptual boundaries, thematic evolution, and analytical priorities of this field remain fragmented across disciplines, and no prior study has systematically mapped the socio-economic dimension of urban resilience through a combined bibliometric and thematic analysis over a multi-decadal horizon. This study addresses that gap by providing a systematic review of global research on urban socio-economic resilience to climate change, integrating bibliometric and thematic analyses of peer-reviewed publications from 1990 to 2025. Following the PRISMA 2020 guidelines, records were retrieved from the Web of Science Core Collection and subjected to a multi-stage screening procedure that combined automated relevance scoring with mandatory manual validation of the socio-economic dimension, resulting in a final dataset of 5076 publications. The analysis examines conceptual interpretations of socio-economic resilience, dominant climate hazards affecting urban systems, methodological approaches and assessment indicators, adaptation strategies and governance responses, and emerging research gaps. The results reveal a marked acceleration of scientific output after 2015, driven by the Paris Agreement and the IPCC Special Report on Global Warming of 1.5 °C (2018). The bibliometric network analyses identify adaptation, vulnerability, flooding, and sustainability transitions as the core thematic clusters. The findings trace a paradigmatic trajectory from equilibrist recovery frameworks toward transformative, socio-economically grounded resilience models and reveal persistent gaps in the operationalization of governance, equity measurement, and geographic representation. 
By synthesizing three-and-a-half decades of scholarship, this review clarifies the intellectual structure of the field and proposes four specific post-2026 research pathways that emphasize longitudinal cross-city comparisons, mixed-methods assessments, sector-specific compound hazard analyses, and governance mechanism studies. Full article
(This article belongs to the Section Social Ecology and Sustainability)

20 pages, 1116 KB  
Article
Process-Integrated Optimization and Symbolic Regression for Direct Prediction of CFRP Area in Masonry Wall Strengthening
by Gebrail Bekdaş, Ammar Khalbous, Sinan Melih Nigdeli and Ümit Işıkdağ
Processes 2026, 14(7), 1163; https://doi.org/10.3390/pr14071163 - 3 Apr 2026
Abstract
Unreinforced masonry walls exhibit limited resistance to lateral loads and, therefore, frequently require strengthening interventions. Carbon fiber reinforced polymer (CFRP) systems provide an efficient retrofit solution; however, current design procedures defined in structural guidelines require repetitive trial calculations to determine the necessary reinforcement amount. This study introduces a hybrid computational process that integrates metaheuristic optimization with symbolic regression to generate direct analytical equations for the estimation of the required CFRP area. First, a comprehensive database containing 1300 optimal strengthening scenarios was generated using the Jaya optimization algorithm under the constraints specified in ACI 440.7R and ACI 530. The resulting dataset was subsequently processed through symbolic regression using the PySR platform to identify explicit mathematical relationships between structural parameters and the optimum CFRP area. Most traditional machine learning approaches operate as black-box predictors. In contrast, the proposed approach generates interpretable closed-form expressions that can be used directly in engineering calculations. Two models were derived from the Pareto-optimal solution set. The first model is a simplified equation emphasizing algebraic simplicity. The second model prioritizes prediction accuracy. The simplified formulation achieved a coefficient of determination of approximately 0.992. The accuracy-focused model achieved a value above 0.997 with very low prediction errors. Validation studies with independent test samples showed that the obtained equations are reliable. The average error for the simplified model is below 4%, and for the high-accuracy model, it is approximately 2%. The results demonstrate that combining the optimization-generated datasets with symbolic regression makes it possible to obtain transparent design equations. 
These equations eliminate iterative design processes and provide a fast and reliable estimation tool for CFRP strengthening of masonry walls. Full article
(This article belongs to the Special Issue Advanced Functional Materials Design and Computation)

15 pages, 6524 KB  
Article
Fourier Ambiguity Validation for Carrier-Phase GNSS
by Peter J. G. Teunissen
Sensors 2026, 26(7), 2201; https://doi.org/10.3390/s26072201 - 2 Apr 2026
Abstract
Carrier-phase ambiguity validation is essential to ensure the reliability of integer ambiguity resolution in high-precision GNSS positioning. Although integer equivariant (IE) estimators provide optimal integer candidates within their class, noise and model limitations may lead to incorrect fixing. Validation procedures are therefore crucial for safeguarding the transition from float to fixed solutions, particularly in high-precision and safety-critical applications. In this contribution we introduce the concept of Fourier ambiguity validation and show how it is rooted in the principles of integer aperture (IA) estimation and its periodic representation. Unlike classical integer estimators that always return an integer solution, IA estimators introduce adjustable acceptance regions in the float ambiguity domain and fix ambiguities only when sufficient statistical evidence is present. As a result we present a general Fourier representation of IA estimators and provide an analytical description of the probabilistic properties of integer-aperture bootstrapping. We also present a hybrid description and show how the spatial and frequency representations can be mixed so as to do justice to the practical situation when carrier-phase ambiguities have a wide range of varying precision. Full article
(This article belongs to the Special Issue Sensors in 2026)

28 pages, 2925 KB  
Article
Explicit Algebraic Approximations for MTPA, MTPV, and Loss-Minimization Optimal Control of PMSMs
by Minho Bae, Su-Min Kim and Han Ho Choi
Electronics 2026, 15(7), 1440; https://doi.org/10.3390/electronics15071440 - 30 Mar 2026
Abstract
This paper presents explicit algebraic methods for approximating optimal dq-axis current references in permanent magnet synchronous motors (PMSMs) under given torque commands. The proposed approach addresses three key optimal control strategies: maximum torque per ampere (MTPA), maximum torque per voltage (MTPV), and loss-minimization control. For MTPA operation, a closed-form explicit formula is derived to approximate the d-axis current that minimizes copper losses. For MTPV operation, an analytical expression is developed to approximate the optimal current vector, effectively addressing iron losses in the high-speed region. Furthermore, a simplified formulation for loss-minimization control is proposed to enhance overall efficiency by balancing both copper and iron losses. These formulas are computationally efficient and eliminate the need for iterative numerical procedures while maintaining high accuracy. Supplementary expressions are also provided to facilitate practical implementation under current and voltage constraints. The mathematical fidelity and computational efficiency of the proposed formulas are rigorously validated through numerical simulations using representative PMSM models. The results demonstrate that the proposed explicit approximations closely match the true numerical optimal trajectories, offering a practical alternative to complex iterative methods without the need for extensive experimental characterization. Full article
27 pages, 7770 KB  
Article
Structured Data Visualization Instruction in Graduate Education: An Empirical Study of Conceptual and Procedural Development
by Simón Gutiérrez de Ravé, Eduardo Gutiérrez de Ravé and Francisco José Jiménez-Hornero
Educ. Sci. 2026, 16(4), 533; https://doi.org/10.3390/educsci16040533 - 27 Mar 2026
Abstract
Information visualization is a crucial yet often underdeveloped research skill in graduate education. This study examined how practice-based visualization instruction enhances graduate students’ conceptual understanding and procedural competence in scientific graph construction. Forty-two first-year graduate students participated in a ten-week instructional program combining diagnostic assessment, guided exercises, and a complex graph replication task. Conceptual and procedural competence were evaluated using validated analytic rubrics to ensure reliability and depth of analysis. Results showed substantial improvement in students’ ability to select suitable chart types, label axes accurately, and apply coherent color schemes. Consistent with the study’s hypotheses, significant gains were observed in conceptual understanding (H1) and technical execution (H2), and a moderate positive correlation between the two domains (H3) confirmed that stronger conceptual grasp aligned with higher visualization proficiency. Iterative feedback and guided reflection supported the integration of theory and practice. However, challenges in detailed annotation and multivariable coordination persisted. Overall, structured, practice-based visualization training enhanced methodological competence and communication clarity. Embedding such experiential learning within graduate curricula can strengthen visualization literacy and support the development of research independence. Full article
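The H3 check described above, relating conceptual and procedural rubric scores, comes down to a Pearson correlation coefficient. A minimal sketch with invented scores, since the study's data are not reproduced here:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical rubric scores (0-20 scale) for a handful of students
conceptual = [12, 15, 9, 17, 14, 11, 16]
procedural = [10, 14, 8, 18, 13, 12, 15]
r = pearson_r(conceptual, procedural)  # positive: higher conceptual scores
                                       # pair with higher procedural scores
```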
(This article belongs to the Section Higher Education)

16 pages, 805 KB  
Article
Simultaneous LC–MS Profiling of Bioactive Ecdysteroids in Nutrient-Dense Plant Sources and Dietary Supplements
by Velislava Todorova, Stanislava Ivanova, Raina Ardasheva and Kalin Ivanov
Molecules 2026, 31(7), 1090; https://doi.org/10.3390/molecules31071090 - 26 Mar 2026
Abstract
Phytoecdysteroids have garnered increasing interest due to their broad biological and pharmacological properties. The present study reports on the development and validation of a reliable liquid chromatography–mass spectrometry method for the detection and quantification of 20-hydroxyecdysone, turkesterone, and ponasterone. The optimized procedure improved ionization efficiency and chromatographic resolution through gradient elution using 0.1% formic acid in water and acetonitrile. Data acquisition in selective ion monitoring modes ensured high analytical precision, reproducibility, and sensitivity. The method demonstrated excellent linearity, accuracy, repeatability, and low detection limits, making it suitable for routine phytochemical and quality control applications. Application of the method to extracts from nutrient-rich superfoods, including kaniwa, spinach, quinoa, and asparagus, confirmed these plants as natural sources of phytoecdysteroids. Additionally, thirteen commercially available dietary supplements labeled as containing extracts of Rhaponticum carthamoides, Cyanotis arachnoidea, Ajuga turkestanica, or ecdysteroids were analyzed. Several products standardized to 80–95% ecdysterone contained substantially lower amounts than declared, with measured 20-hydroxyecdysone levels ranging from below the limit of detection to approximately 50 mg per capsule, whereas some non-standardized products exhibited moderate to high levels, reaching up to approximately 105 mg per capsule. Variability in turkesterone content was also observed among products marketed as standardized extracts. The method provides a simple, reliable, and accessible approach for the quantitative analysis of major phytoecdysteroids in complex plant matrices and dietary supplements. Its implementation may support phytochemical research, routine quality control, and anti-doping monitoring of ecdysteroid-containing products. Full article
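The quantitation step behind the linearity and accuracy figures reported above is routinely an external calibration line fitted by least squares, with sample concentrations back-calculated from peak areas. A generic sketch of that step; the standard concentrations and SIM peak areas below are invented for illustration:

```python
def fit_calibration(concs, areas):
    """Ordinary least-squares calibration line: area = slope * conc + intercept."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(areas) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, areas))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def quantify(area, slope, intercept):
    """Back-calculate an analyte concentration from a sample's peak area."""
    return (area - intercept) / slope

# Hypothetical 20-hydroxyecdysone standards (ug/mL) and SIM peak areas
standards = [1.0, 5.0, 10.0, 25.0, 50.0, 100.0]
peak_areas = [1520, 7480, 15100, 37400, 75200, 149800]
slope, intercept = fit_calibration(standards, peak_areas)
sample_conc = quantify(30500, slope, intercept)  # diluted capsule extract
```

Multiplying the back-calculated concentration by the extraction volume and dilution factor then gives the per-capsule content that the abstract compares against label claims.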
