Search Results (1,770)

Search Parameters:
Keywords = step change test

17 pages, 3530 KB  
Article
Dual-Species Fermentation of a Lycium barbarum–Polygonatum cyrtonema Composite Jiaosu Enhanced Antioxidant Activity and Alleviated Alcohol-Induced Liver Injury in Mice
by Shuyuan Yang, Bingcan Liu, Honghui Geng, Zhen Yu, Wenge Xu, Can Hu, An Zhou, Wencheng Zhang and Zeyu Wu
Foods 2026, 15(8), 1435; https://doi.org/10.3390/foods15081435 - 20 Apr 2026
Abstract
Lycium barbarum–Polygonatum cyrtonema composite jiaosu (LBPCJ) was prepared by sequential dual-species fermentation and evaluated in a mouse model of alcohol-induced liver injury. Following process optimization, a yeast-first sequential strategy with intermediate pasteurization was selected, comprising an initial Saccharomyces cerevisiae fermentation step, intermediate pasteurization, and a subsequent Lactiplantibacillus plantarum fermentation step. Fermentation reduced pH from 4.68 to 3.51 and increased total acidity from 61.06 to 135.39 g LA/L and total phenolic content from 3.01 to 9.39 mg GAE/mL. In vitro antioxidant-related activities were also higher after fermentation, with DPPH, ABTS, and •OH scavenging rates increasing by 39.90%, 29.78%, and 11.10%, respectively. In mice, LBPCJ administration was associated with lower liver index and serum aminotransferase levels, together with attenuated hepatic histopathological alterations, with the high-dose group (15 mL/kg BW) showing the clearest response. These changes were accompanied by higher hepatic SOD and GSH levels and lower MDA, TNF-α, IL-1β, and IL-6 levels. LBJ and PCJ also improved several measured indicators, while LBPCJ showed changes across multiple endpoints under the tested conditions. Overall, sequential fermentation markedly altered the physicochemical and antioxidant-related properties of LBPCJ, and LBPCJ administration improved multiple indicators related to alcohol-induced liver injury in mice, although the specific constituents and underlying mechanisms remain to be clarified.
22 pages, 876 KB  
Article
Large Autonomous Driving Overtaking Decision and Control System Based on Hierarchical Reinforcement Learning
by Chen-Ning Wang and Xiuhui Tang
Electronics 2026, 15(8), 1711; https://doi.org/10.3390/electronics15081711 - 17 Apr 2026
Abstract
To address the bottlenecks of low sample efficiency and poor control accuracy in traditional single-layer reinforcement learning during autonomous driving overtaking, this paper proposes an overtaking decision and control system based on hierarchical reinforcement learning to decouple complex tasks in spatial and temporal dimensions. A heterogeneous two-layer architecture is constructed, where the upper layer adopts the Proximal Policy Optimization algorithm to generate macroscopic discrete decisions, while the lower layer employs Twin Delayed Deep Deterministic Policy Gradient combined with Long Short-Term Memory to achieve smooth continuous control of steering and acceleration by perceiving temporal features of dynamic obstacles. A composite reward mechanism, integrating hard safety constraints and soft efficiency incentives, is designed to balance safety, efficiency, and comfort. Experimental results in complex scenarios with multiple interfering vehicles and random lane-changing behaviors demonstrate that the proposed system improves the training convergence speed by approximately 30% within 500,000 steps compared to single-layer algorithms. In tests across varying traffic densities, the system achieves a 98.3% success rate in medium-density scenarios with a collision rate of only 0.6%. In high-density challenges, the success rate remains above 95%, with the collision rate reduced by about 80% compared to baseline models. Furthermore, the lateral control deviation is strictly limited to within 0.2 m, and the longitudinal safety distance remains stable above 5 m. This system provides a robust, high-efficiency paradigm for autonomous overtaking.
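The composite reward mechanism described in this abstract, hard safety constraints dominating soft efficiency and comfort terms, can be sketched generically. This is a minimal illustration, not the authors' implementation; all weights and thresholds below are invented assumptions (only the 0.2 m lateral bound echoes the deviation limit reported above):

```python
def composite_reward(collision, lateral_dev_m, speed_mps,
                     target_speed_mps=25.0, jerk=0.0,
                     w_eff=1.0, w_comfort=0.1):
    """Hedged sketch of a composite overtaking reward.

    A hard safety constraint (collision) dominates; lateral-deviation,
    efficiency, and comfort terms are soft. Weights are illustrative.
    """
    if collision:                                       # hard constraint: terminal penalty
        return -100.0
    r_safety = -10.0 if lateral_dev_m > 0.2 else 0.0    # keep lateral error within 0.2 m
    r_eff = w_eff * (1.0 - abs(speed_mps - target_speed_mps) / target_speed_mps)
    r_comfort = -w_comfort * abs(jerk)                  # penalize jerky control inputs
    return r_safety + r_eff + r_comfort
```

Shaping the reward this way lets the lower-layer continuous controller trade efficiency against comfort while collisions and large lateral errors always dominate the signal.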
34 pages, 1870 KB  
Article
Determining Univariate Equivalency of Additively Manufactured Parts
by Colin M. Lynch, Rene Villalobos, Brenda Leticia Valadez Mesta, Cesar Gomez Guillen, Jorge Mireles and Ryan B. Wicker
J. Manuf. Mater. Process. 2026, 10(4), 134; https://doi.org/10.3390/jmmp10040134 - 17 Apr 2026
Abstract
Additive manufacturing (AM) requires process-comparison tools that remain practical when sample generation and testing are costly. We propose a univariate, nonparametric workflow for comparing a candidate AM process to a stable reference process by testing distributional equivalency for a single response variable. The method discretizes the reference distribution into empirical percentile-defined bins and combines this representation with a sequential sampling protocol designed to reduce unnecessary sampling when evidence for equivalency or non-equivalency becomes sufficient. Simulation studies were used to evaluate operating characteristics across experimental settings, and a validation case study based on geometric measurements of laser-based powder bed fusion plate scans correctly classified a candidate process expected to be equivalent to the reference while identifying a non-equivalent process at the first sampling step. The workflow is most appropriate for low-sample, high-cost, or throughput-constrained settings, and is best viewed as a tool for process comparability, change control, calibration, and requalification support rather than as a standalone replacement for qualification standards. The full workflow is implemented in the open-source AMEquivalency package to support reproducible analysis.
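The percentile-binning step can be illustrated with NumPy. In this generic sketch (bin count, sample sizes, and distributions are invented for illustration, not taken from the paper), bin edges are defined by empirical percentiles of the reference sample, and candidate observations are counted per bin; under equivalency each bin should receive roughly equal counts:

```python
import numpy as np

def percentile_bins(reference, n_bins=4):
    """Empirical percentile-defined bin edges from a reference sample."""
    qs = np.linspace(0, 100, n_bins + 1)
    edges = np.percentile(reference, qs)
    edges[0], edges[-1] = -np.inf, np.inf   # open outer bins so new data always lands somewhere
    return edges

def bin_counts(candidate, edges):
    """Count candidate observations falling in each reference-defined bin."""
    return np.histogram(candidate, bins=edges)[0]

rng = np.random.default_rng(0)
ref = rng.normal(10.0, 1.0, 200)            # stable reference process
cand = rng.normal(10.0, 1.0, 40)            # candidate process to compare
edges = percentile_bins(ref, n_bins=4)
counts = bin_counts(cand, edges)            # roughly 10 per bin if the processes match
```

A chi-square-style statistic on `counts` against the uniform expectation `n/k` per bin is one natural way to quantify departure from equivalency at each sequential sampling step.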
25 pages, 942 KB  
Article
Hybrid Loss-Based Deep Learning Framework Using EfficientNet-B3 for Multi-Class Colorectal Cancer Detection
by Anusha Nallamalla and Chandrakanta Mahanty
AI 2026, 7(4), 143; https://doi.org/10.3390/ai7040143 - 16 Apr 2026
Abstract
Diagnosis of colorectal cancer (CRC) primarily relies on histopathological examination of hematoxylin and eosin-stained tissue sections; however, manual interpretation is time-consuming, subjective, and increasingly impractical given the rapid growth of digital pathology data. We introduce a hybrid loss-based learning framework for multi-class colorectal histopathology image classification that improves class-balanced performance without increasing model complexity. Several EfficientNet variants were first evaluated to establish a strong baseline, and EfficientNet-B3 was chosen based on validation Matthews Correlation Coefficient (MCC). Extending this backbone, we propose a hybrid loss function that mixes weighted cross-entropy and focal loss, addressing global class imbalance while also focusing on hard-to-classify samples. Experiments on a large-scale colorectal histopathology dataset show that the proposed Hybrid-B3 model significantly improves on the baseline settings. Hybrid-B3 achieves a test accuracy of 99.83% and very high class-balanced performance, with a balanced accuracy and G-mean of 99.85%. Statistical validation using bootstrap confidence intervals and paired significance tests confirms that these improvements are non-random. These results show that loss-function optimization alone can improve robustness and reliability in computational pathology, yielding a practical and scalable solution for colorectal cancer diagnostic support in the real world.
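A weighted cross-entropy plus focal-loss mixture of the kind described above can be sketched in NumPy. This is a generic textbook formulation, not the authors' exact code; `alpha`, `gamma`, the class weights, and the demo probabilities are illustrative assumptions:

```python
import numpy as np

def hybrid_loss(probs, labels, class_weights, alpha=0.5, gamma=2.0):
    """Sketch: alpha * weighted cross-entropy + (1 - alpha) * focal loss.

    probs: (N, C) softmax outputs; labels: (N,) integer class ids.
    """
    p_true = probs[np.arange(len(labels)), labels]                 # probability of the true class
    w = class_weights[labels]                                      # per-sample class weight
    wce = -(w * np.log(p_true)).mean()                             # weighted cross-entropy
    focal = -(((1.0 - p_true) ** gamma) * np.log(p_true)).mean()   # down-weights easy samples
    return alpha * wce + (1.0 - alpha) * focal

# Hypothetical two-sample batch with an up-weighted minority class
probs = np.array([[0.95, 0.03, 0.02],
                  [0.10, 0.80, 0.10]])
labels = np.array([0, 1])
weights = np.array([1.0, 2.0, 2.0])      # illustrative inverse-frequency weights
loss = hybrid_loss(probs, labels, weights)
```

The focal factor `(1 - p_true) ** gamma` makes confidently classified samples contribute almost nothing, so training gradient concentrates on the hard cases that the cross-entropy weights alone do not single out.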
(This article belongs to the Special Issue AI in Bio and Healthcare Informatics)
31 pages, 7153 KB  
Article
Balancing Accuracy and Efficiency in the Temporal Resampling of Met-Ocean Data
by Sara Ramos-Marin and C. Guedes Soares
Oceans 2026, 7(2), 35; https://doi.org/10.3390/oceans7020035 - 16 Apr 2026
Abstract
Harmonising heterogeneous met-ocean time series to a common temporal resolution is a prerequisite for integrated marine renewable energy assessments. Such datasets often differ in their sampling frequency, statistical distribution, and non-stationarity, complicating joint analysis. This study presents a practical multi-criteria framework for selecting temporal interpolation strategies for met-ocean datasets, explicitly balancing prediction accuracy and computational efficiency. Six environmental variables relevant to offshore renewable energy—wind speed, significant wave height, energy period, peak period, global horizontal irradiance, and upper-ocean thermal gradients—are analysed using ten-year reanalysis datasets for the Madeira Archipelago. Six commonly used deterministic time-domain interpolation methods are evaluated within a unified validation framework combining training–test splits, k-fold cross-validation, and Monte Carlo resampling. Their performances are quantified using the relative root mean square error and computational time, integrated through a composite performance score. The results show that makima interpolation provides the most consistent compromise between accuracy and efficiency for most variables in dense, regularly sampled met-ocean datasets, while spline-based approaches perform better for highly skewed solar irradiance. Preprocessing steps, such as detrending and distribution normalisation, yield only marginal improvements for dense, regularly sampled datasets, and method rankings remain stable under moderate changes in accuracy–speed weightings. Rather than proposing a universal interpolator, this work delivers a reproducible decision-support workflow for temporal resampling of multi-variable met-ocean datasets, supporting early-stage marine renewable energy assessments.
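A composite score that trades accuracy against runtime can be sketched as a weighted combination of min-max-normalised error and computation time. The weights, normalisation, and numbers below are invented for illustration; the paper's actual composite performance score may be defined differently:

```python
import numpy as np

def composite_score(rrmse_vals, times, w_acc=0.7, w_time=0.3):
    """Lower is better: min-max normalise each criterion, then weight.

    rrmse_vals: relative RMSE per interpolation method.
    times: computation time per method (seconds).
    Weights are illustrative, not the paper's values.
    """
    def norm(x):
        x = np.asarray(x, dtype=float)
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)
    return w_acc * norm(rrmse_vals) + w_time * norm(times)

# Hypothetical comparison of three interpolators: (RRMSE, seconds) each
scores = composite_score([0.05, 0.02, 0.04], [0.1, 2.0, 0.3])
best = int(np.argmin(scores))   # index of the best accuracy/speed compromise
```

With accuracy weighted at 0.7, the most accurate method wins here even though it is the slowest; shifting the weights toward `w_time` would flip that ranking, which is exactly the sensitivity the paper checks.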
(This article belongs to the Special Issue Offshore Renewable Energy and Related Environmental Science)
21 pages, 2288 KB  
Article
Filling the Gap: Establishing a Statewide Tick and Tick-Borne Pathogen Surveillance Program
by Kyndall C. Dye-Braumuller, Lídia Gual-Gonzalez, Emily Owens Pickle, Christopher Lee, Madeleine M. Meyer-Torelli, Chris L Evans, Jennifer G. Chandler, Rebecca T. Trout Fryxell and Melissa S. Nolan
Insects 2026, 17(4), 414; https://doi.org/10.3390/insects17040414 - 12 Apr 2026
Abstract
Individuals in the southeastern United States of America (USA) have an increasing risk of contracting a tick-borne disease. Land use changes, changing climate, and redistribution of both ticks and their hosts make systematic tick and tick-borne pathogen investigation crucial for public health protection. Prior to 2020, South Carolina had limited data on tick species distribution and tick infection rates. In this work, we describe establishment of a sustainable tick and tick-borne pathogen collaborative network for South Carolina. A major determinant of program success was sharing work effort between the University of South Carolina, the South Carolina Department of Public Health, and key partners including state park employees, local veterinarians, students, and volunteers. The program collected questing ticks from public lands and host-attached ticks from animal shelters. Amblyomma americanum was the most commonly collected tick, with highest density in South Carolina’s southern coastal region. A greater tick species diversity was seen in animal shelter-collected versus questing ticks. Pathogen testing results yielded a high presence of Rickettsia amblyommatis among Am. americanum ticks with several other Rickettsia spp. detected including Rickettsia parkeri, Candidatus R. andeanae, R. montanensis, and R. asembonensis. Additional Rickettsiales detected included multiple Ehrlichia and Anaplasma species, with higher presence in the state’s northern region. Borrelia burgdorferi sensu stricto was detected in one questing Ixodes keiransi from the southern coastal region. The current report presents the initial steps for pathogen and tick species surveillance in South Carolina, providing successes and pitfalls as a model for other states and regions to establish similar efforts to improve national tick surveillance.
(This article belongs to the Section Medical and Livestock Entomology)
28 pages, 2251 KB  
Article
Hierarchical Continuous Monitoring and Resource Reallocation Under Resistance to Change: A Decision-Making Framework Balancing Skill Constraints and Managerial Capacity
by Fotios Panagiotopoulos and Vassilios Chatzis
Algorithms 2026, 19(4), 293; https://doi.org/10.3390/a19040293 - 9 Apr 2026
Abstract
Organizational change is a complex process often accompanied by intense human reactions and increased uncertainty. Resistance to change (RtC) can cause critical performance declines during the organizational change period, which can delay implementation. The evolution of information systems and digital infrastructures provides immediate access to operational data and analytical tools, making it possible to continuously monitor performance and adjust decisions in a timely manner during change. Although recent approaches attempt to minimize these impacts through continuous monitoring and resource reallocation, they typically view human resource allocation as a single-level problem. In hierarchical structures where work and decision-making are distributed across levels, RtC can increase backlogs, place an excessive amount of work on managers, and result in operational issues or the failure of the change. From an algorithmic perspective, the proposed method formulates a hierarchical dynamic optimization problem with two coupled assignment layers, in which the operational output of Level 1 dynamically determines the workload processed at Level 2. Both assignment problems are solved at each time step using the Hungarian algorithm, while RtC is modelled as a time-dependent stochastic process aligned with a reference change curve, allowing employee and managerial performance to be updated dynamically over the planning horizon. In contrast to the static Classical Change Management Model (CCMM), large-scale experimental results demonstrate that the new approach increases total processed workload by approximately 20%, while at the peak of resistance, the improvement reaches 56.8%. At the same time, it substantially reduces backlog accumulation, maintaining very low backlog levels (18 versus 16,424 units) within the tested setting. Finally, by applying a 50% reallocation threshold, the organization maintains 98.5% of maximum performance while avoiding 45% of the reallocations. Overall, the proposed method provides a dynamic optimization framework that combines hierarchical organizational modeling with stochastic performance updates across organizational levels.
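Each per-step assignment in a framework like this can be solved with the Hungarian algorithm; SciPy ships an implementation. The cost matrix below is hypothetical (rows as employees, columns as tasks, entries as resistance-adjusted expected processing times), purely to show the call:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: rows = employees, cols = tasks,
# entries = expected processing time given current (resistance-adjusted) performance.
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.0, 5.0],
                 [3.0, 2.0, 2.0]])

rows, cols = linear_sum_assignment(cost)   # optimal one-to-one assignment
total = cost[rows, cols].sum()             # minimal total processing time
```

In a two-layer scheme, one such solve would assign employees to operational tasks and a second would assign the resulting workload to managers, re-running both at every time step as the modelled resistance curve updates the cost entries.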
(This article belongs to the Special Issue Recent Advances in Numerical Algorithms and Their Applications)
25 pages, 6996 KB  
Article
Uncertainty and Sensitivity Analysis of Input Parameters in the CANDLE Module: A Morris–Sobol–LHS–Iman–Conover Framework
by Fenghui Yang, Wanhong Wang, Rubing Ma and Xiaoming Yang
J. Nucl. Eng. 2026, 7(2), 27; https://doi.org/10.3390/jne7020027 - 6 Apr 2026
Abstract
In this study, an uncertainty quantification (UQ) and sensitivity analysis (SA) workflow was developed for the input parameters of the CANDLE module, which is currently being tested and verified for calculating the downward relocation and solidification of molten core material. The workflow consists of three steps: (i) Morris screening to reduce the input set, (ii) Sobol variance decomposition on the screened subset to compute Sobol sensitivity indices, and (iii) uncertainty propagation using a 2 × 2 design that combines two sampling schemes (MC and LHS) with two dependence settings (independent and correlated inputs). The four cases considered were independent MC, correlated MC, independent LHS, and correlated LHS–Iman–Conover (LHS-IC). We considered 16 input parameters and three output figures of merit (FOMs) and compared the four cases in terms of propagated uncertainty and Shapley-based importance rankings, thereby distinguishing the effects of the sampling scheme, the imposed input dependence, and their interaction. The results show that the molten mass of the current material in the source node is the dominant factor governing the drained melt mass and the remaining melt mass in the receiving node, whereas the cold-wall surface temperature has a significant effect on the mass of molten material that solidifies in the receiving node. The mass of molten material that remains available in the receiving node is mainly governed by the coupled effects of the molten mass of the current material at the source node, the length of the receiving node, and the velocity limit. Under the non-uniform input-parameter distributions adopted in this study, LHS broadened the range of the outputs. After input correlations were introduced, the output distributions changed slightly. This study improves the understanding of input parameter sensitivities and uncertainty propagation in the CANDLE module. It also demonstrates the practical use of LHS-IC for module-level UQ/SA with correlated inputs, providing guidance for subsequent model improvements and parameter tuning.
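Latin hypercube sampling of input parameters can be sketched with SciPy's QMC module. This shows plain LHS only (the Iman–Conover correlation-induction step is more involved and omitted); the parameter names and bounds are invented purely for illustration:

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)   # 3 illustrative input parameters
unit = sampler.random(n=100)                 # stratified samples in [0, 1)^3

# Hypothetical bounds, e.g. molten mass (kg), wall temperature (K), node length (m)
lower = [100.0, 400.0, 0.5]
upper = [500.0, 900.0, 2.0]
samples = qmc.scale(unit, lower, upper)      # rescale to physical parameter ranges
```

Compared with plain Monte Carlo, each of the 100 equal-width strata along every dimension receives exactly one sample, which is why LHS tends to cover (and here broadened) the output range more evenly at the same sample count.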
26 pages, 8956 KB  
Article
Experiments and Simulations on the Factors Governing Fast Transient Responses in Cavity Discharge
by Kang Zuo, Chuankai Liu and Jiajun Wang
Appl. Sci. 2026, 16(7), 3535; https://doi.org/10.3390/app16073535 - 4 Apr 2026
Abstract
Experimental investigations and datasets in the open literature remain scarce for the fast transient response of air systems induced by sudden internal structural failures, hindering rigorous experimental validation of the governing trends associated with multiple influencing factors. To address this gap, we establish a fast transient air-system test platform and develop a step boundary simulation device based on mechanical energy storage, enabling rapid and repeatable boundary transients. The experiments demonstrate that the minimum boundary-change time is less than 6 ms, satisfying the simulation requirement for boundary transients associated with typical sudden structural failures (≤10 ms). Guided by a dimensionless analysis, we conduct fast transient cavity-venting experiments under varying outlet areas, cavity geometric parameters, and initial pressure ratios, thereby obtaining the transient response data of the cavity pressure. In parallel, we simulate the test process using a three-dimensional numerical approach validated against the experiments; by combining experimental and numerical results, we systematically analyze the effects of key factors on the fast transient response during cavity venting and elucidate the underlying mechanisms. This paper provides experimentally validated data and a reliable experimental methodology for studying fast transient response processes in air systems, and it supports the passive safety design of aero-engines.
(This article belongs to the Special Issue Advances in Fluid Mechanics Analysis)
29 pages, 2752 KB  
Article
Policy Shocks and Public Attention to Digital Tax in Greece: Event-Study and Nowcasting with Google Trends Time Series
by Stefanos Balaskas
Account. Audit. 2026, 2(2), 6; https://doi.org/10.3390/accountaudit2020006 - 2 Apr 2026
Abstract
Digital tax reforms are implemented through staged, publicly announced milestones, yet policymakers rarely have timely indicators of whether these signals mobilize information-seeking and whether such demand can be anticipated for operational planning. We analyze monthly Google Trends series for Greece’s myDATA/e-invoicing rollout (2016–present) using preregistered event study models that separate step changes from post-event trend shifts with HAC-robust inference, and we evaluate 1–3-month predictive performance via rolling-origin cross-validation against a seasonal-naïve benchmark. Search-based attention shifts appeared most clearly in application-related queries: invoicing app terms spike around visible rollout phases (≈+34 to +38 index points over six months) and decline around VAT–myDATA alignment (≈−34 to −43). Ecosystem attention (the “Electronic invoicing” topic) exhibits large, opposite-signed movements (≈−53 around public-sector expansion; ≈+46 around VAT alignment), whereas platform terms show smaller and less regular responses; a back-office milestone produces no detectable change. In out-of-sample tests, event-aware regressions improve short-horizon accuracy for platform terms (≈40–50% MAE reduction at one month; ≈18–32% at two to three months), with series- and horizon-dependent results elsewhere. Overall, the evidence supports using search activity as an intermediate planning signal—informative about when and where guidance demand concentrates but not evidence of compliance.
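The seasonal-naïve benchmark used above is simple to state: forecast each month with the value from twelve months earlier, and score with MAE. A minimal sketch on synthetic data (the real evaluation uses rolling-origin cross-validation over the Google Trends series):

```python
import numpy as np

def seasonal_naive_mae(series, period=12):
    """One-step seasonal-naive benchmark: forecast y[t] with y[t - period],
    then report the mean absolute error over all feasible target months."""
    y = np.asarray(series, dtype=float)
    forecasts = y[:-period]      # y[t - period], aligned with targets y[t]
    targets = y[period:]
    return float(np.mean(np.abs(targets - forecasts)))

# A purely seasonal synthetic series: the benchmark forecasts it perfectly
perfect = seasonal_naive_mae(np.tile(np.arange(12.0), 3))
# A pure linear trend breaks the benchmark by exactly the 12-month increment
trended = seasonal_naive_mae(np.arange(36.0))
```

An event-aware regression "improves on the benchmark" when its MAE over the same held-out months falls below this seasonal-naïve MAE, which is the comparison behind the ≈40–50% reductions reported.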
20 pages, 1205 KB  
Article
An Exploratory Protocol for Sustainability-Oriented Cross-Index Assessment of National Climate Policy Effectiveness
by Olena Matukhno, Valentyna Stanytsina, Olena Dobrovolska and Volodymyr Artemchuk
Sustainability 2026, 18(7), 3444; https://doi.org/10.3390/su18073444 - 1 Apr 2026
Abstract
Effective climate policy is central to sustainability transitions and to monitoring progress toward sustainable development, yet national climate policy ratings often differ in scope, indicator design, time coverage, and scoring logic, producing inconsistent country assessments. This creates a need for transparent tools that can compare, interpret, and contextualize existing indices rather than rely on any single metric. This paper develops an exploratory protocol for sustainability-oriented cross-index assessment of national climate policy effectiveness. We combine a structured comparative analysis and a SWOT-informed diagnostic synthesis of four representative approaches—the Climate Change Performance Index (CCPI), Climate Action Tracker (CAT), the Climate Laws, Institutions, and Measures Index (CLIMI), and the Climate Policy Measure Index (CPMI)—with a pilot inter-index concordance test using rank-based correlation analysis for a small country sample and a common reference year (2012). The pilot is intended as an illustrative methodological example rather than a generalizable statistical test. The results indicate strong alignment among broad, composite approaches (CCPI, CAT, CLIMI), while an instrument-focused metric (CPMI, centered on carbon pricing and fiscal signals) shows weaker consistency with outcome- and governance-oriented ratings. Building on these insights, we compile an integrated indicator set that links outcomes (GHG levels and trends), structural drivers (energy mix, efficiency), policy instruments (pricing, regulation, subsidies), governance capacity (legal and institutional strength), and enabling conditions (finance, public engagement, international cooperation). We also specify the operational steps of the proposed workflow, including index selection, temporal harmonization, ordinal encoding, concordance analysis, discrepancy diagnosis, indicator mapping, and provisional normalization, weighting, aggregation, and validation rules for future composite implementation. The protocol should therefore be understood as a sustainability-oriented decision-support workflow for interpreting agreements and disagreements across existing indices and for supporting more balanced evaluation of low-carbon transitions; a fully aggregated composite index with large-sample validation remains a task for future research.
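The rank-based concordance step can be illustrated with Spearman correlation via SciPy. The country scores below are invented purely for illustration, not data from any of the cited indices:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical scores for the same five countries under two different ratings
ccpi_like = [62.1, 55.4, 48.9, 71.0, 40.2]   # invented composite-index scores
cat_like = [0.80, 0.61, 0.55, 0.90, 0.30]    # invented tracker-style scores

rho, pvalue = spearmanr(ccpi_like, cat_like)  # rank correlation between the indices
```

Because Spearman correlates ranks rather than raw values, two indices on entirely different scales can still show perfect concordance (here the invented country orderings coincide, so rho is 1); weak or negative rho across index pairs is what flags the kind of discrepancy the protocol then diagnoses.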
(This article belongs to the Section Development Goals towards Sustainability)
15 pages, 619 KB  
Perspective
Unconstrained Segmental Biomechanics: A Conceptual Framework for Gait Initiation and Locomotor Transitions
by Arianna Fogliata, Lorenzo Cantoni, Alessio Gambetta, Antinea Ambretti and Stefano Tardini
Biomechanics 2026, 6(2), 33; https://doi.org/10.3390/biomechanics6020033 - 1 Apr 2026
Abstract
Background/Objectives: Traditional biomechanical models describe human locomotion as an articulated chain of rigid segments with constrained degrees of freedom, primarily focusing on kinematic descriptions of movement. While this approach facilitates modelling and teaching, it may limit the representation of internal force transmission and dynamic interactions, particularly during transitional phases such as gait initiation. The objective of this article is to propose a conceptual framework, Unconstrained Segmental Biomechanics (USB), to reinterpret locomotor mechanics beyond rigid joint assumptions. Methods: An exploratory analysis of recent PubMed-indexed publications (2024) and commonly adopted educational references in sport science institutions was conducted to examine how locomotion is conceptually represented and to identify possible models analogous to the framework. The aim was to situate the framework within current modelling approaches rather than to provide a systematic literature evaluation. Results: The analysis provided a contextual impression that kinematic representations were more readily identifiable than conceptually analogous models explicitly addressing dynamic intersegmental force transmission. USB is presented as a conceptual framework generating testable biomechanical hypotheses concerning the temporal organisation of intersegmental force transmission during locomotor transitions, including the expectation that during gait initiation gluteus maximus activation precedes observable segmental displacement, that early CoP/GRF changes precede the visible step, and that trunk activation actively contributes to intersegmental force regulation during the transition. Conclusions: USB offers a conceptual framework that enriches the interpretation of gait initiation and locomotor transitions. Future empirical investigations will be necessary to test the biomechanical hypotheses generated by this framework and to evaluate its potential contribution to biomechanics research, education, and applied movement sciences.
22 pages, 1911 KB  
Article
A Two-Step Framework for Mapping, Classification, and Area Estimation of Stand- and Non-Stand-Replacing Forest Disturbances
by Isabel Aulló-Maestro, Saverio Francini, Gherardo Chirici, Cristina Gómez, Icíar Alberdi, Isabel Cañellas, Francesco Parisi and Fernando Montes
Remote Sens. 2026, 18(7), 1038; https://doi.org/10.3390/rs18071038 - 30 Mar 2026
Abstract
In recent decades, forest disturbances have increased in both frequency and intensity, driven by global warming and urbanization. Remote sensing, together with forest disturbance algorithms, offers broad opportunities for forest disturbance monitoring due to its high temporal and spatial resolution. However, operational methods capable of predicting and classifying disturbances while providing official area estimates suitable for national statistics remain scarce. The Three Indices Three Dimensions (3I3D) algorithm has proven effective in identifying forest changes and providing area estimates in Mediterranean ecosystems using Sentinel-2 imagery. Yet, while suitable for change detection, it does not distinguish among disturbance types. Here, we propose a two-step framework for forest disturbance detection and classification, tested in inland Spain for 2018. First, a binary forest change map is produced through an enhanced version of the 3I3D approach. This step incorporates Receiver Operating Characteristic (ROC) analysis to calibrate the algorithm through data-driven threshold selection, allowing adaptation to specific regional conditions. Second, detected changes are classified into four disturbance types: wildfire, clear-cut, thinning, and non-stand-replacing disturbance, using Sentinel-2 spectral bands, 3I3D-derived metrics, and geometric descriptors of disturbance patches. Three machine-learning classifiers were compared: Support Vector Machine, Random Forest, and Neural Network. The detection step reached an overall accuracy of 82%, estimating that 1.43% of Spanish forests (264,900 ha) were disturbed in 2018. In the classification step, Random Forest achieved the best performance, with an overall accuracy of 72%. Of the detected disturbed area, 69% corresponded to non-stand-replacing disturbances, while the remaining area was split among clear-cuts (55%), wildfires (26%), and thinnings (19%). 
By integrating freely available Sentinel-2 imagery, remote sensing algorithms, and photo-interpreted reference datasets, this study provides a scalable and operational approach capable of producing annual disturbance maps that combine both detection and classification of high- and low-intensity disturbances, supporting official national-scale estimates of forest disturbance areas. Full article
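The data-driven threshold selection in the detection step can be illustrated with a simple ROC-style calibration rule such as Youden's J statistic. A minimal sketch on toy change scores, assuming a numpy-only implementation and simulated disturbed/undisturbed score distributions (none of this is the authors' code or data):

```python
import numpy as np

def youden_threshold(scores, labels):
    """Pick the score threshold maximizing Youden's J = TPR - FPR."""
    best_t, best_j = None, -1.0
    pos, neg = labels == 1, labels == 0
    for t in np.unique(scores):
        pred = scores >= t
        tpr = np.mean(pred[pos])   # true positive rate at this threshold
        fpr = np.mean(pred[neg])   # false positive rate at this threshold
        if tpr - fpr > best_j:
            best_t, best_j = t, tpr - fpr
    return best_t

# Toy change scores: disturbed pixels (label 1) score higher on average.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.3, 0.1, 500),    # undisturbed
                         rng.normal(0.7, 0.1, 500)])   # disturbed
labels = np.concatenate([np.zeros(500), np.ones(500)]).astype(int)

t_star = youden_threshold(scores, labels)
print(f"calibrated threshold ~ {t_star:.2f}")
```

Calibrating the threshold on photo-interpreted reference pixels in this way is what lets a fixed algorithm adapt to region-specific spectral conditions.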

28 pages, 706 KB  
Article
AI Innovation and Bank Performance: Evidence from Patent Activity of Large U.S. Commercial Banks
by Yinan Ni, John Nyhoff, Mark Napier and David Townsend
J. Risk Financial Manag. 2026, 19(4), 247; https://doi.org/10.3390/jrfm19040247 - 30 Mar 2026
Abstract
This paper examines the relationship between artificial intelligence (AI) innovation and bank performance, the organizational channels through which these relationships operate, and the role of firm-wide adoption in shaping outcomes. Using patent-based measures of AI innovation for 31 large U.S. commercial banks from 2015 to 2024, based on the Federal Reserve’s Large Bank classification, and employing panel regressions with bank and year fixed effects, we find that AI innovation is associated with improved asset quality but higher operating costs and lower profitability in the short run. Our two-step mediation analysis suggests that AI innovation induces organizational changes, shrinking employee scale and branch networks, which in turn weakens management efficiency and profitability. Importantly, firm-wide AI adoption mitigates the adverse association between AI innovation and both management efficiency and profitability observed prior to adoption, suggesting that the realization of AI’s benefits requires organizational adaptation and coordinated deployment. Dynamic tests further support a productivity “J-curve” of AI innovation. Our findings suggest that bank managers should align AI investment with organizational restructuring and coordinated deployment, while regulators should account for short-term adjustment costs when evaluating the performance implications of AI adoption. Full article
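A two-step mediation analysis of this kind decomposes the effect of AI innovation into a direct path and an indirect path through an organizational mediator (e.g., branch networks). A minimal sketch with simulated cross-sectional data and a numpy-only OLS; the variable names, effect sizes, and the omission of fixed effects are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def ols(y, X):
    """OLS coefficients with an intercept column prepended."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return beta

# Simulated data: AI patents -> branch-network contraction -> profitability.
rng = np.random.default_rng(1)
n = 400
ai = rng.normal(size=n)                          # AI innovation (standardized)
branches = -0.5 * ai + rng.normal(size=n)        # mediator: branch network change
roa = 0.4 * branches + 0.1 * ai + rng.normal(size=n)  # profitability

# Step 1: mediator on treatment. Step 2: outcome on treatment and mediator.
a = ols(branches, ai)[1]
b = ols(roa, np.column_stack([ai, branches]))[2]
indirect = a * b                                 # population value: -0.5 * 0.4
print(f"indirect effect ~ {indirect:.2f}")
```

The paper's actual estimation additionally includes bank and year fixed effects, which the sketch omits for brevity.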

27 pages, 4264 KB  
Article
A Fast Integral Terminal Sliding Mode Buck Converter with a Fixed-Time Observer for Solar-Powered Livestock Smart Collars
by Shiming Zhang, Haochen Ouyang, Shengqiang Shi, Guichang Fang, Zhen Wang, Xinnan Du and Boyan Huang
Agriculture 2026, 16(7), 746; https://doi.org/10.3390/agriculture16070746 - 27 Mar 2026
Abstract
Fully maintenance-free smart collars for range cattle, sheep, and deer must survive years of uncontrolled grazing under highly variable shade and motion conditions. This paper presents an ultra-low-power buck converter governed by a fast integral terminal sliding mode controller (FITSMC) with a fixed-time observer. A new reaching law retains the initial sliding manifold, and a negative-power term maintains the constant switching gain to preserve robustness near the surface while attenuating chattering without widening the bandwidth. The fixed-time observer estimates irradiance and load changes and provides a feed-forward correction, tightening the output regulation regardless of initial conditions. Load-step tests with moderate resistance swings showed that the proposed method recovers noticeably faster and exhibits slightly lower overshoot than a recent method based on a two-phase power reaching law, while visible inductor current spikes are also suppressed. Simulations under daily grazing profiles confirmed tight output regulation adequate for microwatt data logging and periodic long-range (LoRa) bursts. The sleep-mode quiescent current remained around 9 µA, eliminating the need for manual recharge across multi-season field deployments. By integrating robust power electronics with collar-grade solar harvesting, the circuit offers a truly maintenance-free energy path for untethered livestock wearables and supports sustainable precision agriculture. Full article
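The finite-time convergence behavior of terminal reaching laws of the general family referenced here can be checked numerically. A minimal sketch integrating one common form, s' = -k1*s - k2*|s|^a*sgn(s) with 0 < a < 1; the gains, exponent, and time step are illustrative assumptions, not the paper's design values:

```python
import numpy as np

def reach(s0, k1=5.0, k2=5.0, alpha=0.5, dt=1e-4, t_end=0.5):
    """Euler-integrate s' = -k1*s - k2*|s|**alpha * sign(s) from s(0) = s0."""
    s = s0
    for _ in range(int(t_end / dt)):
        s += dt * (-k1 * s - k2 * abs(s) ** alpha * np.sign(s))
        if abs(s) < 1e-9:          # effectively on the sliding surface
            break
    return s

# The fractional-power term drives s to the surface in finite time,
# unlike a purely linear law whose decay is only exponential.
print(f"|s| after 0.5 s: {abs(reach(1.0)):.1e}")
```

An analytical bound for this form, t_r <= ln((k1*|s0|^(1-alpha) + k2) / k2) / (k1*(1 - alpha)), gives about 0.28 s for these illustrative gains, consistent with the simulated convergence inside the 0.5 s window.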
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
