Search Results (921)

Search Parameters:
Keywords = separated calibration

25 pages, 2859 KiB  
Article
Feature-Based Normality Models for Anomaly Detection
by Hui Yie Teh, Kevin I-Kai Wang and Andreas W. Kempa-Liehr
Sensors 2025, 25(15), 4757; https://doi.org/10.3390/s25154757 - 1 Aug 2025
Abstract
Detecting previously unseen anomalies in sensor data is a challenging problem for artificial intelligence when sensor-specific and deployment-specific characteristics of the time series need to be learned from a short calibration period. From the application point of view, this challenge becomes increasingly important because many applications are gravitating towards utilising low-cost sensors for Internet of Things deployments. While these sensors offer cost-effectiveness and customisation, their data quality does not match that of their high-end counterparts. To improve sensor data quality while addressing the challenges of anomaly detection in Internet of Things applications, we present an anomaly detection framework that learns a normality model of sensor data. The framework models the typical behaviour of individual sensors, which is crucial for the reliable detection of sensor data anomalies, especially when dealing with sensors observing significantly different signal characteristics. Our framework learns sensor-specific normality models from a small set of anomaly-free training data while employing an unsupervised feature engineering approach to select statistically significant features. The selected features are subsequently used to train a Local Outlier Factor anomaly detection model, which adaptively determines the boundary separating normal data from anomalies. The proposed anomaly detection framework is evaluated on three real-world public environmental monitoring datasets with heterogeneous sensor readings. The sensor-specific normality models are learned from extremely short calibration periods (as short as the first 3 days or 10% of the total recorded data) and outperform four other state-of-the-art anomaly detection approaches with respect to F1-score (between 5.4% and 9.3% better) and Matthews correlation coefficient (between 4.0% and 7.6% better).
(This article belongs to the Special Issue Innovative Approaches to Cybersecurity for IoT and Wireless Networks)
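The final stage of the framework described above, a Local Outlier Factor model trained only on anomaly-free calibration data, can be sketched with scikit-learn. The feature values, the `n_neighbors` setting, and the two-feature layout below are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)

# Anomaly-free "calibration period": features (e.g. mean and variance of
# short sensor windows). Shapes and values are illustrative only.
train = rng.normal(loc=0.0, scale=1.0, size=(200, 2))

# novelty=True lets the fitted model score unseen data; the decision
# boundary adapts to the density of the calibration data.
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(train)

normal_point = np.array([[0.1, -0.2]])
anomaly = np.array([[8.0, 8.0]])

print(lof.predict(normal_point))  # 1 = normal
print(lof.predict(anomaly))       # -1 = anomaly
```

With `novelty=True` the model scores points it has never seen, which mirrors the paper's setting of a short calibration period followed by open-ended monitoring.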

27 pages, 2327 KiB  
Article
Experimental Study of Ambient Temperature Influence on Dimensional Measurement Using an Articulated Arm Coordinate Measuring Machine
by Vendula Samelova, Jana Pekarova, Frantisek Bradac, Jan Vetiska, Matej Samel and Robert Jankovych
Metrology 2025, 5(3), 45; https://doi.org/10.3390/metrology5030045 - 1 Aug 2025
Abstract
Articulated arm coordinate measuring machines are designed for in situ use directly in manufacturing environments, enabling efficient dimensional control outside of climate-controlled laboratories. This study investigates the influence of ambient temperature variation on the accuracy of length measurements performed with the Hexagon Absolute Arm 8312. The experiment was carried out in a laboratory setting simulating typical shop floor conditions through controlled temperature changes in the range of approximately 20–31 °C. A calibrated steel gauge block was used as a reference standard, allowing separation of the influence of the measuring system from that of the measured object. The results showed that the gauge block length changed in line with the expected thermal expansion, while the articulated arm coordinate measuring machine exhibited only a minor residual thermal drift and stable performance. The experiment also revealed a constant measurement offset of approximately 22 µm, likely due to calibration deviation. As part of the study, an uncertainty budget was developed, taking into account all relevant sources of influence and enabling a more realistic estimation of accuracy under operational conditions. The study confirms that modern carbon composite articulated arm coordinate measuring machines with integrated compensation can maintain stable measurement behavior even under fluctuating temperatures in controlled environments.
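The separation the abstract describes, reference-object expansion versus instrument drift, rests on the linear expansion relation ΔL = α · L₀ · ΔT. A minimal sketch with an assumed expansion coefficient and a hypothetical gauge length (the abstract states neither):

```python
# Expected thermal expansion of a steel gauge block: Delta_L = alpha * L0 * Delta_T.
# alpha and L0 below are assumed illustrative values; the abstract states neither.
alpha = 11.5e-6      # 1/K, typical expansion coefficient of gauge-block steel
L0_mm = 500.0        # mm, hypothetical nominal gauge length
dT = 31.0 - 20.0     # K, span of the temperature range used in the experiment

dL_um = alpha * L0_mm * dT * 1000.0  # expected length change in micrometres
print(f"expected expansion: {dL_um:.1f} um over {dT:.0f} K")
```

Because the block's expansion is predictable this way, a residual temperature-independent offset, such as the constant ~22 µm the authors report, points to a calibration deviation rather than a thermal effect.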

25 pages, 21958 KiB  
Article
ESL-YOLO: Edge-Aware Side-Scan Sonar Object Detection with Adaptive Quality Assessment
by Zhanshuo Zhang, Changgeng Shuai, Chengren Yuan, Buyun Li, Jianguo Ma and Xiaodong Shang
J. Mar. Sci. Eng. 2025, 13(8), 1477; https://doi.org/10.3390/jmse13081477 - 31 Jul 2025
Abstract
Focusing on the problem of insufficient detection accuracy caused by blurred target boundaries, variable scales, and severe noise interference in side-scan sonar images, this paper proposes a high-precision detection network named ESL-YOLO, which integrates edge perception and adaptive quality assessment. Firstly, an Edge Fusion Module (EFM) is designed, which integrates the Sobel operator into depthwise separable convolution. Through a dual-branch structure, it realizes effective fusion of edge features and spatial features, significantly enhancing the ability to recognize targets with blurred boundaries. Secondly, a Self-Calibrated Dual Attention (SCDA) Module is constructed. By means of feature cross-calibration and multi-scale channel attention fusion mechanisms, it achieves adaptive fusion of shallow details and deep semantic features, improving the detection accuracy for small targets and targets with complex shapes. Finally, a Location Quality Estimator (LQE) is introduced, which quantifies localization quality using the statistical characteristics of the bounding box distribution, effectively reducing false and missed detections. Experiments on the SIMD dataset show that the mAP@0.5 of ESL-YOLO reaches 84.65%, with precision and recall of 87.67% and 75.63%, respectively. Generalization experiments on additional sonar datasets further validate the effectiveness of the proposed method across different data distributions and target types, providing an effective technical solution for side-scan sonar image target detection.
(This article belongs to the Section Ocean Engineering)
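The Edge Fusion Module's core idea, applying the Sobel operator channel by channel as a depthwise convolution would, can be illustrated in plain NumPy. This is a conceptual sketch of the edge branch only, not ESL-YOLO's actual module:

```python
import numpy as np

# Sobel kernels for horizontal and vertical gradients.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def depthwise_sobel(img):
    """Apply Sobel filtering independently to each channel (depthwise),
    returning a gradient-magnitude map as a simple edge feature."""
    h, w, c = img.shape
    out = np.zeros_like(img)
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    for ch in range(c):
        for i in range(h):
            for j in range(w):
                patch = padded[i:i + 3, j:j + 3, ch]
                gx = np.sum(patch * SOBEL_X)
                gy = np.sum(patch * SOBEL_Y)
                out[i, j, ch] = np.hypot(gx, gy)
    return out

# A vertical step edge: left half dark, right half bright.
img = np.zeros((8, 8, 1))
img[:, 4:, 0] = 1.0
edges = depthwise_sobel(img)
# The response peaks along the step and vanishes in flat regions.
print(edges[4, 0, 0], edges[4, 4, 0])
```

A depthwise separable convolution applies exactly this kind of per-channel filtering before a pointwise mixing step; fixing the per-channel kernel to Sobel injects the edge prior that the module then fuses with learned spatial features.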

13 pages, 1486 KiB  
Article
Evaluation of Miscible Gas Injection Strategies for Enhanced Oil Recovery in High-Salinity Reservoirs
by Mohamed Metwally and Emmanuel Gyimah
Processes 2025, 13(8), 2429; https://doi.org/10.3390/pr13082429 - 31 Jul 2025
Abstract
This study presents a comprehensive evaluation of miscible gas injection (MGI) strategies for enhanced oil recovery (EOR) in high-salinity reservoirs, with a focus on the Raleigh Oil Field. Using a calibrated Equation of State (EOS) model in CMG WinProp™, eight gas injection scenarios were simulated to assess phase behavior, miscibility, and swelling factors. The results indicate that carbon dioxide (CO2) and enriched separator gas offer the most technically and economically viable options, with CO2 demonstrating superior swelling performance and lower miscibility pressure requirements. The findings underscore the potential of CO2-EOR as a sustainable and effective recovery method in pressure-depleted, high-salinity environments.
(This article belongs to the Special Issue Recent Developments in Enhanced Oil Recovery (EOR) Processes)

33 pages, 4670 KiB  
Article
Universal Prediction of CO2 Adsorption on Zeolites Using Machine Learning: A Comparative Analysis with Langmuir Isotherm Models
by Emrah Kirtil
ChemEngineering 2025, 9(4), 80; https://doi.org/10.3390/chemengineering9040080 - 28 Jul 2025
Abstract
The global atmospheric concentration of carbon dioxide (CO2) has exceeded 420 ppm. Adsorption-based carbon capture technologies offer energy-efficient, sustainable solutions. Relying on classical adsorption models like Langmuir to predict CO2 uptake presents limitations due to the need for case-specific parameter fitting. To address this, the present study introduces a universal machine learning (ML) framework, combining Generalized Linear Model (GLM), Feed-forward Multilayer Perceptron (DL), Decision Tree (DT), Random Forest (RF), Support Vector Machine (SVM), and Gradient Boosted Trees (GBT) algorithms, to reliably predict CO2 adsorption capacities across diverse zeolite structures and conditions. By compiling over 5700 experimentally measured adsorption data points from 71 independent studies, this approach systematically incorporates critical factors including pore size, Si/Al ratio, cation type, temperature, and pressure. Rigorous cross-validation confirmed the superior performance of the GBT model (R2 = 0.936, RMSE = 0.806 mmol/g), which outperformed the other ML models and matched classical Langmuir model predictions without separate parameter calibration. Feature importance analysis identified pressure, Si/Al ratio, and cation type as the dominant influences on adsorption performance. Overall, this ML-driven methodology demonstrates substantial promise for accelerating material discovery, optimization, and practical deployment of zeolite-based CO2 capture technologies.
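The case-specific parameter fitting that the ML framework sidesteps is the classical Langmuir calibration. A minimal sketch using the standard linearization on synthetic, noise-free data (all parameter values are assumptions for illustration):

```python
import numpy as np

def langmuir(p, q_max, K):
    """Langmuir isotherm: uptake q at pressure p."""
    return q_max * K * p / (1.0 + K * p)

# Synthetic CO2-uptake data generated from assumed parameters.
q_max_true, K_true = 4.0, 2.5        # mmol/g, 1/bar (illustrative)
p = np.linspace(0.1, 5.0, 20)        # bar
q = langmuir(p, q_max_true, K_true)

# Classical linearization: p/q = 1/(q_max*K) + p/q_max, so a straight-line
# fit of p/q against p recovers the case-specific parameters.
slope, intercept = np.polyfit(p, p / q, 1)
q_max_fit = 1.0 / slope
K_fit = slope / intercept

print(round(q_max_fit, 3), round(K_fit, 3))  # 4.0 2.5
```

Each new zeolite and condition set needs its own (q_max, K) fit like this one, which is exactly the per-case effort a single trained GBT model avoids.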

21 pages, 1844 KiB  
Article
Fast, Simple and Accurate Method for Simultaneous Determination of α-Lipoic Acid and Selected Thiols in Human Saliva by Capillary Electrophoresis with UV Detection and pH-Mediated Sample Stacking
by Urszula Sudomir, Justyna Piechocka, Rafał Głowacki and Paweł Kubalczyk
Molecules 2025, 30(15), 3129; https://doi.org/10.3390/molecules30153129 - 25 Jul 2025
Abstract
This report presents the first method for simultaneous determination of the 2-S-lepidinium derivatives of total α-lipoic acid (LA), homocysteine (Hcy), cysteinylglycine (CysGly), and cysteine (Cys) in human saliva, using capillary electrophoresis with pH-mediated sample stacking and ultraviolet detection (CE-UV) at 355 nm. Electrophoretic separation is carried out at 20 kV and 25 °C using a standard fused silica capillary (effective length 91.5 cm, inner diameter 75 µm). The background electrolyte consists of 0.5 mol/L lithium acetate buffer, adjusted to pH 3.5 with 0.5 mol/L acetic acid. The limit of quantification was determined to be 1 µmol/L for LA and 0.17 µmol/L for Hcy, 0.11 µmol/L for CysGly, and 0.10 µmol/L for Cys in saliva samples. Calibration curves demonstrated linearity over the concentration range of 3 to 30 µmol/L for all analytes. Method precision did not exceed 4.7%, and accuracy ranged from 87.9% to 114.0%. The developed method was successfully applied to saliva samples from eleven apparently healthy volunteers to determine the content of LA, Hcy, CysGly, and Cys. The Hcy, CysGly, and Cys concentrations ranged from 0.55 to 13.76 µmol/L, 0.89 to 9.29 µmol/L, and 1.73 to 12.99 µmol/L, respectively. No LA-derived peaks were detected in the native saliva samples.
(This article belongs to the Section Analytical Chemistry)
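The linearity reported for the calibration curves corresponds to an ordinary least-squares check of detector response against concentration. A sketch over the stated 3–30 µmol/L range with hypothetical peak-area values:

```python
import numpy as np

# Hypothetical calibration data over the reported 3-30 umol/L range; the
# peak-area values are illustrative, not from the paper.
conc = np.array([3.0, 7.5, 15.0, 22.5, 30.0])      # umol/L
signal = np.array([1.21, 3.05, 6.02, 9.10, 12.05])  # arbitrary peak-area units

slope, intercept = np.polyfit(conc, signal, 1)
predicted = slope * conc + intercept

# Coefficient of determination as the linearity criterion.
ss_res = np.sum((signal - predicted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"slope={slope:.4f}, intercept={intercept:.4f}, r^2={r_squared:.4f}")
```

An unknown sample's concentration is then read off as `(area - intercept) / slope`, valid only inside the calibrated range.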

26 pages, 55836 KiB  
Article
Experimental Acoustic Investigation of Rotor Noise Directivity and Decay in Multiple Configurations
by Giovanni Fasulo, Giosuè Longobardo, Fabrizio De Gregorio and Mattia Barbarino
Aerospace 2025, 12(7), 647; https://doi.org/10.3390/aerospace12070647 - 21 Jul 2025
Abstract
In the framework of the MATIM project, an acoustic test campaign was conducted on a platform derived from a commercial-class quadcopter within the CIRA semi-anechoic chamber. A dedicated rotor rig allowed systematic measurements of thrust, torque, and shaft speed together with near- and far-field noise using ten calibrated 1/2-inch precision microphones. Three configurations were examined: an isolated rotor, the same rotor mounted on an aluminium quadcopter plate, and the full four-rotor assembly. The resulting data set, acquired over 3000–8000 rpm, documents the azimuthal directivity and radial decay of tonal and broadband noise while separating motor, propeller, and installation contributions. Analysis shows that a nearby rigid plate scatters part of the sound field towards frontal and oblique observers and produces a shielding effect in the rotor plane. The combined operation of four rotors further redistributes energy and broadens blade-passing frequency harmonics. The database is intended as a benchmark for aeroacoustic codes and for the development of reduced-order models.

34 pages, 3579 KiB  
Review
A Comprehensive Review of Mathematical Error Characterization and Mitigation Strategies in Terrestrial Laser Scanning
by Mansoor Sabzali and Lloyd Pilgrim
Remote Sens. 2025, 17(14), 2528; https://doi.org/10.3390/rs17142528 - 20 Jul 2025
Abstract
In recent years, there has been an increasing transition from 1D point-based to 3D point-cloud-based data acquisition for monitoring applications and deformation analysis tasks. Previously, many studies relied on point-to-point measurements using total stations to assess structural deformation. However, the introduction of terrestrial laser scanning (TLS) has ushered in a new era of data capture with a high level of efficiency and flexibility in data collection and post-processing. A robust understanding of both data acquisition and processing techniques is therefore required to guarantee high-quality deliverables and to geometrically separate measurement uncertainty from true movements. TLS is in high demand for capturing detailed 3D point coordinates of a scene at either short or long range. Although various studies have examined scanner misalignments under controlled conditions at short observation ranges (scanner calibration), there remains a knowledge gap in understanding and characterizing errors related to long-range scanning (scanning calibration). Furthermore, the limited information available on manufacturer-oriented calibration tests motivates the design of user-oriented calibration tests. This research investigated the four primary sources of error in the generic error model of TLS: instrumental imperfections of the scanner itself, atmospheric effects that impact the laser beam, scanning geometry concerning the setup and varying incidence angles during scanning, and object and surface characteristics affecting overall data accuracy. This study presents previous findings on TLS calibration relevant to these four error sources and their mitigation strategies, and identifies current challenges that can serve as potential research directions.

11 pages, 219 KiB  
Article
Diagnostic Accuracy of a Machine Learning-Derived Appendicitis Score in Children: A Multicenter Validation Study
by Emrah Aydın, Taha Eren Sarnıç, İnan Utku Türkmen, Narmina Khanmammadova, Ufuk Ateş, Mustafa Onur Öztan, Tamer Sekmenli, Necip Fazıl Aras, Tülin Öztaş, Ali Yalçınkaya, Murat Özbek, Deniz Gökçe, Hatice Sonay Yalçın Cömert, Osman Uzunlu, Aliye Kandırıcı, Nazile Ertürk, Alev Süzen, Fatih Akova, Mehmet Paşaoğlu, Egemen Eroğlu, Gülnur Göllü Bahadır, Ahmet Murat Çakmak, Salim Bilici, Ramazan Karabulut, Mustafa İmamoğlu, Haluk Sarıhan and Süleyman Cüneyt Karakuş
Children 2025, 12(7), 937; https://doi.org/10.3390/children12070937 - 16 Jul 2025
Abstract
Background: Accurate diagnosis of acute appendicitis in children remains challenging due to variable presentations and limitations of existing clinical scoring systems. While machine learning (ML) offers a promising approach to enhance diagnostic precision, most prior studies have been limited by small sample sizes, single-center data, or a lack of external validation. Methods: This prospective, multicenter study included 8586 pediatric patients to develop a machine learning-based diagnostic model using routinely available clinical and hematological parameters. A separate, prospectively collected external validation cohort of 3000 patients was used to assess model performance. The Random Forest algorithm was selected based on its superior performance during model comparison. Diagnostic accuracy, sensitivity, specificity, Area Under Curve (AUC), and calibration metrics were evaluated and compared with traditional scoring systems such as Pediatric Appendicitis Score (PAS), Alvarado, and Appendicitis Inflammatory Response Score (AIRS). Results: The ML model outperformed traditional clinical scores in both development and validation cohorts. In the external validation set, the Random Forest model achieved an AUC of 0.996, accuracy of 0.992, sensitivity of 0.998, and specificity of 0.993. Feature-importance analysis identified white blood cell count, red blood cell count, and mean platelet volume as key predictors. Conclusions: This large, prospectively validated study demonstrates that a machine learning-based scoring system using commonly accessible data can significantly improve the diagnosis of pediatric appendicitis. The model offers high accuracy and clinical interpretability and has the potential to reduce diagnostic delays and unnecessary imaging.
(This article belongs to the Section Global Pediatric Health)
16 pages, 689 KiB  
Article
Quantification of Total and Unbound Selinexor Concentrations in Human Plasma by a Fully Validated Liquid Chromatography-Tandem Mass Spectrometry Method
by Suhyun Lee, Seungwon Yang, Hyeonji Kim, Wang-Seob Shim, Eunseo Song, Seunghoon Han, Sung-Soo Park, Suein Choi, Sungpil Han, Sung Hwan Joo, Seok Jun Park, Beomjin Shin, Donghyun Kim, Hyeon Su Kim, Kyung-Tae Lee and Eun Kyoung Chung
Pharmaceutics 2025, 17(7), 919; https://doi.org/10.3390/pharmaceutics17070919 - 16 Jul 2025
Abstract
Background/Objectives: Selinexor is a selective nuclear-export inhibitor approved for hematologic malignancies, characterized by extensive plasma protein binding (>95%). However, a validated analytical method to accurately measure the clinically relevant unbound fraction of selinexor in human plasma has not yet been established. This study aimed to develop a fully validated bioanalytical assay for simultaneous quantification of total and unbound selinexor concentrations in human plasma. Methods: We established and fully validated an analytical method based on liquid chromatography–tandem mass spectrometry (LC-MS/MS) capable of quantifying total and unbound selinexor concentrations in human plasma. Unbound selinexor was separated using ultrafiltration, and selinexor was efficiently extracted from 50 μL of plasma by liquid–liquid extraction. Chromatographic separation was achieved on a C18 column using an isocratic mobile phase (0.1% formic acid:methanol, 12:88 v/v) with a relatively short runtime of 2.5 min. Results: Calibration curves showed excellent linearity over a range of 5–2000 ng/mL for total selinexor (r2 ≥ 0.998) and 0.05–20 ng/mL for unbound selinexor (r2 ≥ 0.995). The precision (%CV ≤ 10.35%) and accuracy (92.5–104.3%) for both analytes met the regulatory criteria. This method successfully quantified selinexor in plasma samples from renally impaired patients with multiple myeloma, demonstrating potential inter-individual differences in unbound drug concentrations. Conclusions: This validated bioanalytical assay enables precise clinical pharmacokinetic assessments in a short runtime using a small plasma volume and, thus, assists in individualized dosing of selinexor, particularly for renally impaired patients with altered protein binding.
(This article belongs to the Section Pharmacokinetics and Pharmacodynamics)

16 pages, 5918 KiB  
Article
Effects of Climate Change and Human Activities on the Flow of the Muling River
by Xiang Meng, Chang-Lei Dai, Yi-Ding Zhang, Geng-Wei Liu, Xiao Yang and Xue Feng
Hydrology 2025, 12(7), 180; https://doi.org/10.3390/hydrology12070180 - 3 Jul 2025
Abstract
In the context of global warming and intensifying human activities, changes in runoff are increasing. Determining these changes is very important for the rational utilization of water resources. To identify the factors influencing runoff change in the Muling River, the SWAT model was used in this study to separate coupled driving factors and to calculate the contribution of each single factor to runoff change at the annual and seasonal scales. During calibration, different numbers of calibration iterations were used to analyze their influence on the calibration results. The results show that runoff in the Muling River basin exhibits a downward trend; at the seasonal scale, temperature has the greatest influence on runoff change (50–60%), while at the annual scale, precipitation has the greatest influence (68%), and reservoir operation accounts for up to 42.5% of the runoff change attributable to human activities. In the SWAT-CUP software, the optimal number of calibration iterations for this basin is 500. This research provides a scientific basis for flow analysis of the Muling River basin.

19 pages, 1436 KiB  
Article
Development and Validation of Bioanalytical LC–MS/MS Method for Pharmacokinetic Assessment of Amoxicillin and Clavulanate in Human Plasma
by Sangyoung Lee, Da Hyun Kim, Sabin Shin, Jee Sun Min, Duk Yeon Kim, Seong Jun Jo, Ui Min Jerng and Soo Kyung Bae
Pharmaceuticals 2025, 18(7), 998; https://doi.org/10.3390/ph18070998 - 2 Jul 2025
Abstract
Background/Objectives: We developed and validated a robust and simple LC–MS/MS method for the simultaneous quantification of amoxicillin and clavulanate in human plasma, improving on previously reported methods. Methods: Amoxicillin, clavulanate, and an internal standard (4-hydroxytolbutamide) in human K2-EDTA plasma were deproteinized with acetonitrile and then subjected to back-extraction using distilled water–dichloromethane. Separation was performed on a Poroshell 120 EC-C18 column with a mobile-phase gradient comprising 0.1% aqueous formic acid and acetonitrile at a flow rate of 0.5 mL/min within 6.5 min. Negative electrospray ionization mode was used to monitor the transitions m/z 363.9→223.1 (amoxicillin), m/z 198.0→135.8 (clavulanate), and m/z 285.0→185.8 (4-hydroxytolbutamide). Results/Conclusions: Calibration curves exhibited linear ranges of 10–15,000 ng/mL for amoxicillin (r ≥ 0.9945) and 20–10,000 ng/mL for clavulanate (r ≥ 0.9959). Intra- and inter-day coefficients of variation, indicating the precision of the assay, were ≤7.08% for amoxicillin and ≤10.7% for clavulanate, and relative errors in accuracy ranged from −1.26% to 10.9% for amoxicillin and from −4.41% to 8.73% for clavulanate. All other validation results met regulatory criteria. Partial validation in lithium–heparin, sodium–heparin, and K3-EDTA plasma confirmed applicability in multicenter or large-scale studies. The assay was shown to be environmentally friendly, as assessed by the Analytical GREEnness (AGREE) tool, and was successfully applied to a clinical pharmacokinetic study of an Augmentin® IR tablet (250/125 mg). The inter-individual variability in clavulanate exposure (AUCt and Cmax) was significantly greater than that of amoxicillin, which may inform the clinical design of future drug–drug interaction studies.

24 pages, 8519 KiB  
Article
Probing Equatorial Ionospheric TEC at Sub-GHz Frequencies with Wide-Band (B4) uGMRT Interferometric Data
by Dipanjan Banerjee, Abhik Ghosh, Sushanta K. Mondal and Parimal Ghosh
Universe 2025, 11(7), 210; https://doi.org/10.3390/universe11070210 - 26 Jun 2025
Abstract
Phase stability at low radio frequencies is severely impacted by ionospheric propagation delays. Radio interferometers such as the giant metrewave radio telescope (GMRT) are capable of detecting changes in the ionosphere's total electron content (TEC) over larger spatial scales and with greater sensitivity compared to conventional tools like the global navigation satellite system (GNSS). Thanks to its unique design, featuring both a dense central array and long outer arms, and its strategic location, the GMRT is particularly well-suited for studying the sensitive ionospheric region located between the northern peak of the equatorial ionization anomaly (EIA) and the magnetic equator. In this study, we observe the bright flux calibrator 3C48 for ten hours to characterize and study the low-latitude ionosphere with the upgraded GMRT (uGMRT). We outline the methods used for wideband data reduction and processing to accurately measure differential TEC (δTEC) between antenna pairs, achieving sub-mTECU precision (1 mTECU = 10⁻³ TECU) for central square antennas and approximately mTECU-level precision for arm antennas. The measured δTEC values are used to estimate the TEC gradient across the GMRT arm antennas. We measure the ionospheric phase structure function and find a power-law slope of β = 1.72 ± 0.07, indicating deviations from pure Kolmogorov turbulence. The inferred diffractive scale, the spatial separation over which the phase variance reaches 1 rad², is approximately 6.66 km. This small diffractive scale implies high phase variability across the field of view and reduced temporal coherence, which poses challenges for calibration and imaging.
(This article belongs to the Section Planetary Sciences)
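The diffractive scale quoted above follows from the power-law structure function D_phi(r) = (r / r_diff)^β with D_phi(r_diff) = 1 rad². A sketch inverting that relation; the baseline and measured variance below are hypothetical, chosen only so the result lands near the abstract's ~6.66 km:

```python
# Power-law phase structure function: D_phi(r) = (r / r_diff) ** beta, where
# the diffractive scale r_diff is defined by D_phi(r_diff) = 1 rad^2.
beta = 1.72        # slope reported in the abstract
r_km = 25.0        # hypothetical antenna separation along a GMRT arm
D_phi = 9.74       # rad^2, hypothetical phase variance measured at that separation

# Invert the power law: r_diff = r / D_phi ** (1 / beta)
r_diff_km = r_km / D_phi ** (1.0 / beta)
print(f"diffractive scale ~ {r_diff_km:.2f} km")
```

Phase variance exceeding 1 rad² within a few kilometres means widely separated arm antennas see largely decorrelated phases, which is the calibration and imaging challenge the abstract notes.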

16 pages, 2704 KiB  
Article
Shear Capacity of Masonry Walls Externally Strengthened via Reinforced Khorasan Jacketing
by Cagri Mollamahmutoglu, Mehdi Ozturk and Mehmet Ozan Yilmaz
Buildings 2025, 15(13), 2177; https://doi.org/10.3390/buildings15132177 - 22 Jun 2025
Abstract
This study investigates the in-plane shear behavior of solid brick masonry walls, both unreinforced and retrofitted using Reinforced Khorasan Jacketing (RHJ), a traditional pozzolanic mortar technique rooted in Iranian and Ottoman architecture. Six one-block-thick English bond masonry walls were tested in three configurations: unreinforced with Horasan plaster (Group I), reinforced with steel mesh aligned to wall edges (Group II), and reinforced with mesh aligned diagonally (Group III). All the walls were plastered with 3.5 cm of Horasan mortar and tested after 18 months using diagonal compression, with load-displacement data recorded. A detailed 3D micro-modeling approach was employed in finite element simulations, with bricks and mortar modeled separately. The Horasan mortar was represented using an elastoplastic Mohr-Coulomb model with a custom softening law (parabolic-to-exponential), calibrated via inverse parameter fitting using the Nelder-Mead algorithm. The numerical predictions closely matched the experimental data. Reinforcement improved the shear strength significantly: Group II showed a 1.8 times increase, and Group III up to 2.7 times. Ductility, measured as post-peak deformation capacity, increased by factors of two (parallel) and three (diagonal). These enhancements transformed the brittle failure mode into a more ductile, energy-absorbing behavior. RHJ is shown to be a compatible, effective retrofit solution for historic masonry structures.
(This article belongs to the Section Building Structures)
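Inverse parameter fitting with the Nelder-Mead algorithm, as used to calibrate the softening law, can be sketched generically: minimize the misfit between a parabolic-to-exponential curve and observed data. The curve below is an illustrative stand-in, not the paper's constitutive model:

```python
import numpy as np
from scipy.optimize import minimize

def softening(d, s_peak, d_ref):
    """Illustrative softening law: parabolic rise to a peak at d_ref,
    then exponential decay (stand-in for the paper's actual model)."""
    u = d / d_ref
    return np.where(d < d_ref,
                    s_peak * (2 * u - u ** 2),
                    s_peak * np.exp(-(d - d_ref) / d_ref))

d = np.linspace(0.0, 5.0, 50)
observed = softening(d, 3.0, 1.2)  # synthetic "experiment" with known truth

def misfit(params):
    # Sum-of-squares mismatch between model and observed curve.
    return np.sum((softening(d, *params) - observed) ** 2)

# Nelder-Mead is derivative-free, which suits misfits evaluated through
# a black-box finite element simulation.
result = minimize(misfit, x0=[1.0, 0.5], method="Nelder-Mead")
print(result.x)  # recovers approximately [3.0, 1.2]
```

In the study the forward model is a finite element simulation rather than a closed-form curve, but the calibration loop has the same shape: propose parameters, run the model, score the mismatch, let the simplex iterate.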

21 pages, 1792 KiB  
Article
Assessment of Baseflow Separation Methods Used in the Estimations of Design-Related Storm Hydrographs Across Various Return Periods
by Oscar E. Coronado-Hernández, Rafael D. Méndez-Anillo and Manuel Saba
Hydrology 2025, 12(6), 158; https://doi.org/10.3390/hydrology12060158 - 19 Jun 2025
Abstract
Accurately estimating storm hydrographs for various return periods is crucial for planning and designing hydrological infrastructure, such as dams and drainage systems. A key aspect of this estimation is the separation of baseflow from storm runoff. This study proposes a method for deriving storm hydrographs for different return periods based on hydrological station records. The proposed approach uses three baseflow separation methods: constant, linear, and master recession curve. A significant advantage of the proposed method over traditional rainfall–runoff approaches is its minimal parameter requirements during calibration. The methodology is tested on records from the Lengupá River watershed in Colombia, using data from the Páez hydrological station, which has a drainage area of 1090 km2. The results indicate that the linear method yields the most accurate hydrograph estimates, as demonstrated by its lower root mean square error (RMSE) of 0.35%, compared to the other baseflow separation techniques, the values of which range from 2.92 to 3.02%. A frequency analysis of hydrological data was conducted using Pearson Type III and Generalized Extreme Value distributions to identify the most suitable statistical models for estimating extreme events regarding peak flow and maximum storm hydrograph volume. The findings demonstrate that the proposed methods effectively reproduce storm hydrographs for return periods ranging from 5 to 200 years, providing valuable insights for hydrological design, which can be employed using the data from stream gauging stations in rivers.
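Two of the three baseflow-separation methods the study compares, constant and linear, are simple enough to sketch directly; the discharge series and event indices below are illustrative:

```python
import numpy as np

# Illustrative storm-event discharge series (m^3/s) at a gauging station.
streamflow = np.array([12.0, 11.5, 11.0, 25.0, 60.0, 48.0, 30.0, 18.0, 12.5])
start, end = 2, 8  # assumed event start (rising limb) and end of recession

# Constant method: baseflow fixed at the pre-event discharge.
base_const = np.full_like(streamflow, streamflow[start])
runoff_const = np.clip(streamflow - base_const, 0.0, None)

# Linear method: baseflow interpolated from event start to event end.
t = np.arange(len(streamflow), dtype=float)
base_lin = np.interp(t, [start, end], [streamflow[start], streamflow[end]])
runoff_lin = np.clip(streamflow - base_lin, 0.0, None)

print("constant-method peak runoff:", runoff_const.max())  # 49.0
print("linear-method peak runoff:", runoff_lin.max())      # 48.5
```

The study's third method, the master recession curve, additionally characterizes the basin's recession behaviour before separating, and is not sketched here.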
