Search Results (6,018)

Search Parameters:
Keywords = novelty

15 pages, 2183 KiB  
Article
Effective Endotoxin Reduction in Hospital Reverse Osmosis Water Using eBooster™ Electrochemical Technology
by José Eudes Lima Santos, Letícia Gracyelle Alexandre Costa, Carlos Alberto Martínez-Huitle and Sergio Ferro
Water 2025, 17(15), 2353; https://doi.org/10.3390/w17152353 (registering DOI) - 7 Aug 2025
Abstract
Endotoxins, lipopolysaccharides released from the outer membrane of Gram-negative bacteria, pose a significant risk in healthcare environments, particularly in Central Sterile Supply Departments (CSSDs), where the delivery of sterile, pyrogen-free medical devices is critical for patient safety. Traditional methods for controlling endotoxins in water systems, such as ultraviolet (UV) disinfection, have proven ineffective at reducing endotoxin concentrations to comply with regulatory standards (<0.25 EU/mL). This limitation presents a significant challenge, especially in the context of reverse osmosis (RO) permeate used in CSSDs, where water typically has very low conductivity. Despite the established importance of endotoxin removal, a gap in the literature exists regarding effective chemical-free methods that can meet the stringent endotoxin limits in such low-conductivity environments. This study addresses this gap by evaluating the effectiveness of the eBooster™ electrochemical technology, featuring proprietary electrode materials and a reactor design optimized for potable water, for endotoxin removal from water, specifically under the low-conductivity conditions typical of RO permeate. Laboratory experiments using the B250 reactor achieved >90% endotoxin reduction (from 1.2 EU/mL to <0.1 EU/mL) at flow rates ≤5 L/min and current densities of 0.45–2.7 mA/cm². Additional real-world testing at three hospitals showed that the eBooster™ unit, when installed in the RO tank recirculation loop, consistently reduced endotoxin levels from 0.76 EU/mL (with UV) to <0.05 EU/mL over 24 months of operation, while heterotrophic plate counts dropped from 190 to <1 CFU/100 mL. Statistical analysis confirmed the reproducibility and flow-rate dependence of the removal efficiency. 
Limitations observed included reduced efficacy at higher flow rates, the need for sufficient residence time, and a temporary performance decline after two years due to a power fault, which was promptly corrected. Compared to earlier approaches, eBooster™ demonstrated superior performance in low-conductivity environments without added chemicals or significant maintenance. These findings highlight the strength and novelty of eBooster™ as a reliable, chemical-free, and maintenance-friendly alternative to traditional UV disinfection systems, offering a promising solution for critical water treatment applications in healthcare environments. Full article
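The >90% reduction quoted in this abstract follows directly from the reported inlet and outlet concentrations; a minimal sketch of the arithmetic (the helper function is illustrative, not part of the study):

```python
# Percent reduction computed from the inlet/outlet endotoxin
# concentrations reported in the abstract (1.2 -> 0.1 EU/mL).
def percent_reduction(c_in, c_out):
    return 100.0 * (1.0 - c_out / c_in)

result = round(percent_reduction(1.2, 0.1), 1)
print(result)  # a little above 90%, consistent with the reported ">90%"
```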
17 pages, 307 KiB  
Article
“The Language of the Digital Air”: AI-Generated Literature and the Performance of Authorship
by Silvana Colella
Humanities 2025, 14(8), 164; https://doi.org/10.3390/h14080164 (registering DOI) - 7 Aug 2025
Abstract
The release of ChatGPT and similar applications in 2022 prompted wide-ranging discussions concerning the impact of AI technologies on writing, creativity, and authorship. This article explores the question of artificial writing, taking into consideration both critical theories and creative experiments. In the first section, I review current scholarly discussions about authorship in the age of generative AI. In the second and third sections, I turn to experiments in literary co-creation that combine the affordances of technology with the human art of prompting and editing or curating. My argument has three prongs: (1) experiments that frame artificial writing as literature (memoir, poetry, autobiography, fiction) are accompanied by enlarged paratexts, which merit more attention than they have hitherto received; (2) paratexts provide salient clues on the process of co-creation, the reconfiguration of authorship, and the production of value; and (3) in the folds of paratextual explanations, one can detect the profile of the author as clever prompter, navigating a new terrain by relying at times on the certainties of conventional authorship. My analyses show that while AI-generated literature is a novel phenomenon worthy of closer scrutiny, the novelty tends to be cloaked in a familiar garb. Full article
23 pages, 714 KiB  
Article
Thermodynamic Analysis of Biomass Pyrolysis in an Auger Reactor Coupled with a Fluidized-Bed Reactor for Catalytic Deoxygenation
by Balkydia Campusano, Michael Jabbour, Lokmane Abdelouahed and Bechara Taouk
Processes 2025, 13(8), 2496; https://doi.org/10.3390/pr13082496 (registering DOI) - 7 Aug 2025
Abstract
This research contributes to advancing the sustainable production of biofuels and provides insights into the energy and exergy assessment of bio-oil, which is essential for developing environmentally friendly energy production solutions. Energy and exergy analyses were performed to evaluate the pyrolysis of beech wood biomass at 500 °C in an Auger reactor. To improve the quality of the obtained bio-oil, its catalytic deoxygenation was performed within an in-line fluidized catalytic bed reactor using a catalyst based on HZSM-5 zeolite modified with 5 wt.% iron (5%FeHZSM-5). A thermodynamic analysis of the catalytic and non-catalytic pyrolysis system was carried out, as well as a comparative study of the calculation methods for the energy and exergy evaluation of bio-oil. The required heat for pyrolysis was found to be 1.2 MJ/kg of biomass in the case of non-catalytic treatment and 3.46 MJ/kg of biomass in the presence of the zeolite-based catalyst. The exergy efficiency in the Auger reactor was 90.3%. Using the catalytic system coupled to the Auger reactor, this efficiency increased to 91.6%, leading to less energy degradation. Calculating the total energy and total exergy of the bio-oil using two different methods showed a difference of 6%. In the first method, only the energy contributions of the model compounds, corresponding to the major compounds of each chemical family of bio-oil, were considered. In contrast, in the second method, all molecules identified in the bio-oil were considered for the calculation. The second method proved to be more suitable for thermodynamic analysis. The novelties of this work are, on the one hand, the thermodynamic analysis of a coupled system comprising an Auger biomass pyrolysis reactor and a fluidized-bed catalytic deoxygenation reactor and, on the other hand, the use of all the molecules identified in the oily phase for the evaluation of energy and exergy. Full article
(This article belongs to the Section Chemical Processes and Systems)
23 pages, 273 KiB  
Article
Checklist of the Tribe Eucosmini Obraztsov, 1946 (Lepidoptera: Tortricidae: Olethreutinae) from Taiwan
by Yinghui Sun, John W. Brown, Ming Liu, Qiangcheng Zeng and Houhun Li
Insects 2025, 16(8), 819; https://doi.org/10.3390/insects16080819 - 7 Aug 2025
Abstract
This study presents an updated and detailed inventory of the tortricid tribe Eucosmini found in Taiwan Province, China, highlighting 26 genera and 53 species. Several taxonomic novelties were revealed: The genus Coenobiodes is newly recorded for Taiwan Province, Hermenias semicurva is newly reported for China, and six species are newly recorded for Taiwan. Most Eucosmini species in Taiwan are widespread in eastern Asia and/or the Palearctic, but 12 (23%) are endemic to Taiwan. Biogeographical distributions are provided for each species, and a list of the specimens examined is provided where applicable. These findings underscore Taiwan’s status as a biodiversity hotspot and offer crucial data for understanding regional and global biodiversity patterns. Full article
(This article belongs to the Section Insect Systematics, Phylogeny and Evolution)
13 pages, 382 KiB  
Article
Determination of Stiffness Coefficients at the Internal Vertices of the Tree Based on a Finite Set of Eigenvalues of an Asymmetric Second-Order Linear Differential Operator
by Baltabek Kanguzhin, Zhalgas Kaiyrbek and Mergul Mustafina
Symmetry 2025, 17(8), 1263; https://doi.org/10.3390/sym17081263 - 7 Aug 2025
Abstract
A second-order linear differential operator A is defined on a tree of arbitrary topology. Any internal vertex P of the tree divides the tree into m_P branches. The restrictions A_i, i = 1, …, m_P, of the operator A to each of these branches are considered, where the vertex P is taken as the root of the branch and a Dirichlet boundary condition is specified at the root. These restrictions must be, in a sense, asymmetric (not similar) to each other. Thus, the distinguished class of differential operators A turns out to have only simple eigenvalues. Moreover, the matching conditions at the internal vertices of the graph contain a set of parameters. These parameters are interpreted as stiffness coefficients. This paper proves that a finite set of eigenvalues allows one to uniquely restore the set of stiffness coefficients. The novelty of the work lies in the fact that knowing a finite set of eigenvalues of intermediate Weinstein problems suffices to uniquely restore the required stiffness coefficients. We not only describe the results of selected studies but also compare them with each other and establish interconnections. Full article
(This article belongs to the Section Mathematics)
18 pages, 973 KiB  
Article
Machine Learning-Based Vulnerability Detection in Rust Code Using LLVM IR and Transformer Model
by Young Lee, Syeda Jannatul Boshra, Jeong Yang, Zechun Cao and Gongbo Liang
Mach. Learn. Knowl. Extr. 2025, 7(3), 79; https://doi.org/10.3390/make7030079 - 6 Aug 2025
Abstract
Rust’s growing popularity in high-integrity systems requires automated vulnerability detection in order to maintain its strong safety guarantees. Although Rust’s ownership model and compile-time checks prevent many errors, unexpected bugs may occasionally pass analysis, underlining the necessity for automated detection of safe and unsafe code. This paper presents Rust-IR-BERT, a machine learning approach to detect security vulnerabilities in Rust code by analyzing its compiled LLVM intermediate representation (IR) instead of the raw source code. The approach is novel in employing LLVM IR’s language-neutral, semantically rich representation of the program, facilitating robust detection by capturing core data- and control-flow semantics and reducing language-specific syntactic noise. Our method leverages a graph-based transformer model, GraphCodeBERT, a pretrained transformer architecture that encodes structural code semantics via data-flow information, followed by a gradient boosting classifier, CatBoost, which is capable of handling complex feature interactions, to classify code as vulnerable or safe. The model was evaluated using a carefully curated dataset of over 2300 real-world Rust code samples (vulnerable and non-vulnerable Rust code snippets) from the RustSec and OSV advisory databases, compiled to LLVM IR and labeled with the corresponding Common Vulnerabilities and Exposures (CVE) identifiers to ensure comprehensive and realistic coverage. Rust-IR-BERT achieved an overall accuracy of 98.11%, with a recall of 99.31% for safe code and 93.67% for vulnerable code. Despite these promising results, this study acknowledges potential limitations, such as its focus on known CVEs. Built on a representative dataset spanning over 2300 real-world Rust samples from diverse crates, Rust-IR-BERT delivers consistently strong performance. 
Looking ahead, practical deployment could take the form of a Cargo plugin or pre-commit hook that automatically generates and scans LLVM IR artifacts during the development cycle, enabling developers to catch vulnerabilities early. Full article
19 pages, 1185 KiB  
Article
PredictMed-CDSS: Artificial Intelligence-Based Decision Support System Predicting the Probability to Develop Neuromuscular Hip Dysplasia
by Carlo M. Bertoncelli, Federico Solla, Michal Latalski, Sikha Bagui, Subhash C. Bagui, Stefania Costantini and Domenico Bertoncelli
Bioengineering 2025, 12(8), 846; https://doi.org/10.3390/bioengineering12080846 - 6 Aug 2025
Abstract
Neuromuscular hip dysplasia (NHD) is a common deformity in children with cerebral palsy (CP). Although some predictive factors of NHD are known, the prediction of NHD is in its infancy. We present a Clinical Decision Support System (CDSS) designed to calculate the probability of developing NHD in children with CP. The system utilizes an ensemble of three machine learning (ML) algorithms: Neural Network (NN), Support Vector Machine (SVM), and Logistic Regression (LR). The development and evaluation of the CDSS followed the DECIDE-AI guidelines for AI-driven clinical decision support tools. The ensemble was trained on a data series from 182 subjects. Inclusion criteria were age between 12 and 18 years and a diagnosis of CP from two specialized units. Clinical and functional data were collected prospectively between 2005 and 2023 and then analyzed in a cross-sectional study. Accuracy and area under the receiver operating characteristic curve (AUROC) were calculated for each method. The best logistic regression scores highlighted history of previous orthopedic surgery (p = 0.001), poor motor function (p = 0.004), truncal tone disorder (p = 0.008), scoliosis (p = 0.031), number of affected limbs (p = 0.05), and epilepsy (p = 0.05) as predictors of NHD. Both accuracy and AUROC were highest for the NN, at 83.7% and 0.92, respectively. The novelty of this study lies in the development of an efficient CDSS prototype specifically designed to predict future outcomes of NHD in patients with CP using clinical data. The proposed system, PredictMed-CDSS, demonstrated strong predictive performance for estimating the probability of NHD development in children with CP, with the highest accuracy achieved using the NN. PredictMed-CDSS has the potential to assist clinicians in anticipating the need for early interventions and preventive strategies in the management of NHD among CP patients. Full article
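The ensemble described above (NN, SVM, and LR combined) can be sketched as a simple majority vote; the stand-in "models" and feature names below are invented for illustration and are not the authors' trained classifiers:

```python
# Majority-vote ensemble over three stand-in classifiers. The feature names
# and decision rules are hypothetical placeholders, not the trained NN/SVM/LR
# models from PredictMed-CDSS.
def majority_vote(models, x):
    """Return 1 (NHD risk) if most models vote 1, else 0."""
    votes = [m(x) for m in models]
    return 1 if sum(votes) > len(votes) / 2 else 0

# Stand-in "models": each maps a feature dict to a 0/1 prediction.
nn  = lambda x: 1 if x["motor_score"] < 3 else 0
svm = lambda x: 1 if x["prior_surgery"] else 0
lr  = lambda x: 1 if x["scoliosis"] and x["epilepsy"] else 0

patient = {"motor_score": 2, "prior_surgery": True,
           "scoliosis": False, "epilepsy": False}
print(majority_vote([nn, svm, lr], patient))  # two of three models vote 1
```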
30 pages, 8483 KiB  
Article
Research on Innovative Design of Two-in-One Portable Electric Scooter Based on Integrated Industrial Design Method
by Yang Zhang, Xiaopu Jiang, Shifan Niu and Yi Zhang
Sustainability 2025, 17(15), 7121; https://doi.org/10.3390/su17157121 - 6 Aug 2025
Abstract
With the advancement of low-carbon and sustainable development initiatives, electric scooters, recognized as essential transportation tools and leisure products, have gained significant popularity, particularly among young people. However, the current electric scooter market is plagued by severe product similarity. Once the initial novelty fades for users, the usage frequency declines, resulting in considerable resource wastage. This research collected user needs via surveys and employed the KJ method (affinity diagram) to synthesize fragmented insights into cohesive thematic clusters. Subsequently, a hierarchical needs model for electric scooters was constructed using analytical hierarchy process (AHP) principles, enabling systematic prioritization of user requirements through multi-criteria evaluation. By establishing a house of quality (HoQ), user needs were transformed into technical characteristics of electric scooter products, and the corresponding weights were calculated. After analyzing the positive and negative correlation degrees of the technical characteristic indicators, it was found that there are technical contradictions between functional zoning and compact size, lightweight design and material structure, and smart interaction and usability. Then, based on the theory of inventive problem solving (TRIZ), the contradictions were classified, and corresponding problem-solving principles were identified to achieve a multi-functional innovative design for electric scooters. This research, leveraging a systematic industrial design analysis framework, identified critical pain points among electric scooter users, established hierarchical user needs through priority ranking, and improved product lifecycle sustainability. It offers novel methodologies and perspectives for advancing theoretical research and design practices in the electric scooter domain. Full article
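The AHP step described above reduces pairwise judgments to priority weights; a minimal sketch using the row geometric-mean approximation, with a hypothetical 3-criterion comparison matrix (the criteria and judgments are invented, not taken from the study):

```python
# Geometric-mean approximation of AHP priority weights from a pairwise
# comparison matrix. The three criteria and the judgments are hypothetical.
import math

def ahp_weights(M):
    """Priority weights via the row geometric-mean method."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Rows/columns: portability, battery life, smart features (invented judgments:
# portability moderately preferred over battery life, strongly over features).
M = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # weights sum to 1, portability dominates
```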
15 pages, 1835 KiB  
Article
Stress Development in Droplet Impact Analysis of Rain Erosion Damage on Wind Turbine Blades: A Review of Liquid-to-Solid Contact Conditions
by Quentin Laplace Oddo, Quaiyum M. Ansari, Fernando Sánchez, Leon Mishnaevsky and Trevor M. Young
Appl. Sci. 2025, 15(15), 8682; https://doi.org/10.3390/app15158682 - 6 Aug 2025
Abstract
The wind energy sector is experiencing substantial growth, with global wind turbine capacity increasing and projected to expand further in the coming years. However, rain erosion on the leading edges of turbine blades remains a significant challenge, affecting both aerodynamic efficiency and structural longevity. The associated degradation reduces annual energy production and leads to high maintenance costs due to frequent inspections and repairs. To address this issue, researchers have developed numerical models to predict blade erosion caused by water droplet impacts. This study presents a finite element analysis model in Abaqus to simulate the interaction between a single water droplet and the wind turbine blade material. The novelty of this model lies in evaluating the influence of several parameters on the von Mises and S33 peak stresses in the leading-edge protection (LEP), such as the friction coefficient, type of contact, impact velocity, and droplet diameter. The findings provide insights into optimising LEP numerical models to simulate rain erosion as closely as possible to real-world scenarios. Full article
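The von Mises peak stress evaluated in this model is a standard scalar derived from the Cauchy stress components; a minimal sketch of the formula (the sample values in MPa are arbitrary, not results from the study):

```python
# Von Mises equivalent stress from Cauchy stress components, as used when
# post-processing FE results such as droplet-impact peak stresses.
import math

def von_mises(s11, s22, s33, s12=0.0, s23=0.0, s13=0.0):
    """Equivalent (von Mises) stress for a general 3D stress state."""
    return math.sqrt(
        0.5 * ((s11 - s22) ** 2 + (s22 - s33) ** 2 + (s33 - s11) ** 2)
        + 3.0 * (s12 ** 2 + s23 ** 2 + s13 ** 2)
    )

# Arbitrary illustrative principal stresses in MPa.
print(round(von_mises(120.0, 40.0, -20.0), 1))
```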
18 pages, 1974 KiB  
Article
GoSS-Rec: Group-Oriented Segment Sequence Recommendation
by Marco Aguirre, Lorena Recalde and Edison Loza-Aguirre
Information 2025, 16(8), 668; https://doi.org/10.3390/info16080668 - 6 Aug 2025
Abstract
In recent years, the advancement of various applications, data mining technologies, and socio-technical systems has led to the development of interactive platforms that enhance user experiences through personalization. In the sports domain, users can access training plans, routes, and healthy habits, all in a personalized way thanks to sports recommender systems. These recommendation engines are fueled by rich datasets collected through continuous monitoring of users’ activities. However, their user profiling is typically limited to single users and does not capture the dynamics of groups of athletes. This paper introduces GoSS-Rec, a Group-oriented Segment Sequence Recommender System designed for groups of cyclists who participate in fitness activities. The system analyzes collective preferences and activity records to provide personalized route recommendations that encourage exploration of diverse cycling paths and enhance group activities. Our experiments show that GoSS-Rec, which is based on Prod2vec, consistently outperforms other models on diversity and novelty, regardless of group size. This indicates the potential of our model to provide unique and customized suggestions, making GoSS-Rec a notable innovation in the field of sports recommender systems and expanding the possibilities of personalized experiences beyond traditional areas. Full article
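A Prod2vec-style recommender like GoSS-Rec retrieves items by embedding similarity; a toy sketch of the group step, averaging member-route vectors and ranking unseen routes by cosine similarity (route names and embeddings below are invented stand-ins, not the system's learned vectors):

```python
# Toy group recommendation: average the group's past-route embeddings into a
# profile, then pick the unseen route most similar to it. All vectors are
# hand-made stand-ins for Prod2vec embeddings.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

routes = {
    "hill_loop":    [0.9, 0.1, 0.2],
    "river_path":   [0.1, 0.8, 0.3],
    "city_circuit": [0.2, 0.2, 0.9],
    "forest_trail": [0.5, 0.5, 0.2],
}
group_history = [routes["hill_loop"], routes["river_path"]]
profile = [sum(xs) / len(xs) for xs in zip(*group_history)]  # group centroid
best = max((r for r in routes if routes[r] not in group_history),
           key=lambda r: cosine(profile, routes[r]))
print(best)
```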
23 pages, 3831 KiB  
Article
Estimating Planetary Boundary Layer Height over Central Amazonia Using Random Forest
by Paulo Renato P. Silva, Rayonil G. Carneiro, Alison O. Moraes, Cleo Quaresma Dias-Junior and Gilberto Fisch
Atmosphere 2025, 16(8), 941; https://doi.org/10.3390/atmos16080941 - 5 Aug 2025
Abstract
This study investigates the use of a Random Forest (RF), an artificial intelligence (AI) model, to estimate the planetary boundary layer height (PBLH) over Central Amazonia from climatic element data collected during the GoAmazon experiment, held in 2014 and 2015, as the PBLH is a key metric for air quality, weather forecasting, and climate modeling. The novelty of this study lies in estimating the PBLH using only surface-based meteorological observations. This approach is validated against remote sensing measurements (e.g., LIDAR, ceilometer, and wind profilers), which are seldom available in the Amazon region. The dataset includes various meteorological features, though substantial missing data for the latent heat flux (LE) and net radiation (Rn) measurements posed challenges. We addressed these gaps through different data-cleaning strategies, such as feature exclusion, row removal, and imputation techniques, assessing their impact on model performance using the Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and r² metrics. The best-performing strategy achieved an RMSE of 375.9 m. In addition to the RF model, we benchmarked its performance against Linear Regression, Support Vector Regression, LightGBM, XGBoost, and a Deep Neural Network. While all models showed moderate correlation with the observed PBLH, the RF model outperformed all others, with statistically significant differences confirmed by paired t-tests. SHAP (SHapley Additive exPlanations) values were used to enhance model interpretability, revealing hour of the day, air temperature, and relative humidity as the most influential predictors of the PBLH, underscoring their critical role in atmospheric dynamics in Central Amazonia. Despite these optimizations, the model underestimates PBLH values by an average of 197 m, particularly in the austral spring and early summer seasons, when atmospheric conditions are more variable. 
These findings emphasize the importance of robust data preprocessing and highlight the potential of ML models for improving PBLH estimation in data-scarce tropical environments. Full article
(This article belongs to the Special Issue Applications of Artificial Intelligence in Atmospheric Sciences)
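The RMSE, MAE, and r² scores used to compare the models above are straightforward to compute; a plain-Python sketch with made-up sample values (the real study uses the GoAmazon series):

```python
# Standard regression metrics written out in plain Python.
# The observed/predicted PBLH values below are invented for illustration.
import math

def rmse(y, yhat):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mae(y, yhat):
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def r2(y, yhat):
    ybar = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - ybar) ** 2 for a in y)
    return 1 - ss_res / ss_tot

obs  = [1200.0, 900.0, 1500.0, 700.0]   # observed PBLH (m), made up
pred = [1100.0, 950.0, 1400.0, 800.0]   # model estimates (m), made up
print(round(rmse(obs, pred), 1), round(mae(obs, pred), 1), round(r2(obs, pred), 3))
```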
29 pages, 3188 KiB  
Article
A Multimodal Bone Stick Matching Approach Based on Large-Scale Pre-Trained Models and Dynamic Cross-Modal Feature Fusion
by Tao Fan, Huiqin Wang, Ke Wang, Rui Liu and Zhan Wang
Appl. Sci. 2025, 15(15), 8681; https://doi.org/10.3390/app15158681 - 5 Aug 2025
Abstract
Among the approximately 60,000 bone stick fragments unearthed from the Weiyang Palace site of the Han Dynasty, about 57,000 bear inscriptions. Most of these fragments exhibit vertical fractures, leading to a separation between the upper and lower fragments, which poses significant challenges to digital preservation and artifact restoration. Manual matching is inefficient and may cause further damage to the bone sticks. This paper proposes a novel multimodal bone stick matching approach that integrates image, inscription, and archeological information to enhance the accuracy and efficiency of matching fragmented bone stick artifacts. Unlike traditional methods that rely solely on image data, our method leverages large-scale pre-trained models, namely Vision-RWKV for visual feature extraction, RWKV for inscription analysis, and BERT for archeological metadata encoding. A dynamic cross-modal feature fusion mechanism is introduced to effectively combine these features, enabling better interaction and weighting based on the contextual relevance of each modality. This approach significantly improves matching performance, particularly in challenging cases involving fractures, corrosion, and missing sections. The novelty of this method lies in its ability to simultaneously extract and fuse multiple sources of information, addressing the limitations of traditional image-based matching methods. This paper uses Rank-N and Cumulative Match Characteristic (CMC) curves as evaluation metrics. Experimental evaluation shows that the matching accuracy reaches 94.73% at Rank-15, and the method performs significantly better than the comparative methods on the CMC evaluation curve, demonstrating outstanding performance. Overall, this approach significantly enhances the efficiency and accuracy of bone stick artifact matching, providing robust technical support for the research and restoration of bone stick cultural heritage. Full article
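Rank-N accuracy and the CMC curve reported above summarize, for each query fragment, where the true match lands in the ranked candidate list; a minimal sketch with invented per-query ranks (not the study's data):

```python
# CMC curve from per-query true-match ranks: entry k-1 is the fraction of
# queries whose correct match appears within the top k candidates.
def cmc(true_ranks, max_rank):
    n = len(true_ranks)
    return [sum(1 for r in true_ranks if r <= k) / n for k in range(1, max_rank + 1)]

ranks = [1, 3, 1, 2, 7, 1, 4, 1, 2, 15]   # hypothetical per-query ranks
curve = cmc(ranks, 15)
print(round(curve[0], 2))    # Rank-1 accuracy
print(round(curve[14], 2))   # Rank-15 accuracy
```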
29 pages, 3266 KiB  
Article
Wavelet Multiresolution Analysis-Based Takagi–Sugeno–Kang Model, with a Projection Step and Surrogate Feature Selection for Spectral Wave Height Prediction
by Panagiotis Korkidis and Anastasios Dounis
Mathematics 2025, 13(15), 2517; https://doi.org/10.3390/math13152517 - 5 Aug 2025
Abstract
The accurate prediction of significant wave height presents a complex yet vital challenge in the field of ocean engineering. This capability is essential for disaster prevention, fostering sustainable development, and deepening our understanding of various scientific phenomena. We explore the development of a comprehensive predictive methodology for wave height prediction by integrating novel Takagi–Sugeno–Kang fuzzy models within a multiresolution analysis framework. The multiresolution analysis emerges via wavelets, since they are prominent models characterised by their inherent multiresolution nature. The maximal overlap discrete wavelet transform is utilised to generate the detail and resolution components of the time series resulting from this multiresolution analysis. The novelty of the proposed model lies in its hybrid training approach, which combines least squares with AdaBound, a gradient-based algorithm derived from the deep learning literature. Significant wave height prediction is studied as a time series problem; hence, the appropriate inputs to the model are selected by developing a surrogate-based wrapper algorithm. The developed wrapper-based algorithm employs Bayesian optimisation to deliver a fast and accurate method for feature selection. In addition, we introduce a projection step to further refine the approximation capabilities of the resulting predictive system. The proposed methodology is applied to a real-world time series pertaining to spectral wave height, obtained from the Poseidon operational oceanography system at the Institute of Oceanography, part of the Hellenic Center for Marine Research. Numerical studies showcase a high degree of approximation performance. The predictive scheme with the projection step yields a coefficient of determination of 0.9991, indicating a high level of accuracy. Furthermore, it outperforms the second-best comparative model by approximately 49% in terms of root mean squared error. 
Comparative evaluations against powerful artificial intelligence models, using regression metrics and hypothesis tests, underscore the effectiveness of the proposed methodology. Full article
(This article belongs to the Special Issue Applications of Mathematics in Neural Networks and Machine Learning)
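The multiresolution split that feeds the fuzzy models can be illustrated with a minimal Haar "à trous" decomposition in plain NumPy. This is a hedged sketch, not the paper's implementation: the `modwt_haar` helper and the toy wave-height series are illustrative assumptions, chosen only to show the shift-invariant, additive detail-plus-smooth structure that a MODWT-style analysis produces.

```python
import numpy as np

def modwt_haar(x, levels):
    """MODWT-style pyramid with the Haar filter pair ("a trous" scheme).

    Returns (details, smooth): one detail series per level plus the final
    smooth series, all the same length as the input (shift-invariant,
    circular boundary handling via np.roll).
    """
    x = np.asarray(x, dtype=float)
    details, approx = [], x
    for j in range(levels):
        rolled = np.roll(approx, -(2 ** j))      # filter upsampled at each level
        details.append((approx - rolled) / 2.0)  # rescaled Haar wavelet filter
        approx = (approx + rolled) / 2.0         # rescaled Haar scaling filter
    return details, approx

# Toy significant-wave-height series: the components sum back to the signal,
# so each resolution level can feed its own Takagi-Sugeno-Kang fuzzy model.
t = np.arange(128)
swh = 1.5 + 0.5 * np.sin(2 * np.pi * t / 24) + 0.05 * np.cos(2 * np.pi * t / 5)
details, smooth = modwt_haar(swh, levels=3)
assert np.allclose(sum(details) + smooth, swh)  # additive multiresolution split
```

Because every component keeps the original length, lagged values of each detail and smooth series can be used directly as candidate inputs for the wrapper-based feature selection.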
31 pages, 5644 KiB  
Article
Mitigation Technique Using a Hybrid Energy Storage and Time-of-Use (TOU) Approach in Photovoltaic Grid Connection
by Mohammad Reza Maghami, Jagadeesh Pasupuleti, Arthur G. O. Mutambara and Janaka Ekanayake
Technologies 2025, 13(8), 339; https://doi.org/10.3390/technologies13080339 - 5 Aug 2025
Abstract
This study investigates the impact of Time-of-Use (TOU) scheduling and battery energy storage systems (BESS) on voltage stability in a typical Malaysian medium-voltage distribution network with high photovoltaic (PV) penetration. The analyzed network comprises 110 nodes connected via eight feeders to a pair of 132/11 kV, 15 MVA transformers, supplying a total load of 20.006 MVA. Each node is integrated with a 100 kW PV system, enabling scenarios of up to 100% PV penetration. A hybrid mitigation strategy combining TOU-based load shifting and BESS was implemented to address voltage violations, which occur particularly during low-load night hours. Dynamic simulations were conducted in DIgSILENT PowerFactory under worst-case (no-load and peak-load) conditions. The novelty of this research is the use of real rural network data to validate a hybrid BESS–TOU strategy, supported by a detailed sensitivity analysis across PV penetration levels, providing practical voltage stabilization insights not demonstrated in earlier studies. Results show that at 100% PV penetration, neither TOU nor BESS alone is sufficient to fully mitigate voltage drops. However, a hybrid application of 0.4 MWh of BESS with 20% TOU load shifting eliminates voltage violations across all nodes, raising the minimum voltage from 0.924 p.u. to 0.951 p.u. while reducing active power losses and grid dependency. A sensitivity analysis further reveals that 60% PV penetration can be supported reliably using only 0.4 MWh of BESS and 10% TOU; beyond this, hybrid mitigation becomes essential to maintain stability. The proposed solution demonstrates a scalable approach to large-scale PV integration in dense rural grids and addresses the specific operational characteristics of Malaysian networks, which differ from the commonly studied IEEE test systems. This work fills a critical research gap by using real local data to propose and validate practical voltage mitigation strategies. Full article
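The hybrid TOU-plus-BESS idea can be sketched numerically: shift a fraction of the evening load into the PV-rich midday window and dispatch a small battery against the residual peak. The 24-hour profiles below are illustrative assumptions (not the Malaysian feeder data or a power-flow simulation), as is the flat charge/discharge schedule for the 0.4 MWh battery.

```python
import numpy as np

hours = np.arange(24)
# Illustrative feeder load (MW) with an evening peak and a midday PV profile (MW).
load = 1.5 + 0.8 * np.exp(-0.5 * ((hours - 20) / 2.0) ** 2)
pv = np.clip(1.2 * np.sin(np.pi * (hours - 7) / 12), 0.0, None)  # generates 07:00-19:00

# Time-of-Use shifting: move 20% of the 19:00-22:00 load into 11:00-14:00.
tou, evening, midday = load.copy(), slice(19, 23), slice(11, 15)
shifted = 0.20 * tou[evening]
tou[evening] -= shifted
tou[midday] += shifted.sum() / 4  # spread evenly over the four midday hours

# 0.4 MWh BESS: charge from PV surplus at midday, discharge during the evening peak.
bess = np.zeros(24)
bess[midday] = 0.4 / 4    # charging adds to net load
bess[evening] = -0.4 / 4  # discharging relieves the feeder

net_base = load - pv
net_hybrid = tou - pv + bess
print(f"peak net load, no mitigation: {net_base.max():.3f} MW")
print(f"peak net load, TOU + BESS:   {net_hybrid.max():.3f} MW")
```

In a real study the reduced net load at each node would then be fed to a power-flow solver (here, DIgSILENT PowerFactory) to verify that node voltages stay within limits; the sketch only shows why flattening the net-load profile relieves the voltage-drop mechanism.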
14 pages, 881 KiB  
Article
Fine-Tuning BiomedBERT with LoRA and Pseudo-Labeling for Accurate Drug–Drug Interactions Classification
by Ioan-Flaviu Gheorghita, Vlad-Ioan Bocanet and Laszlo Barna Iantovics
Appl. Sci. 2025, 15(15), 8653; https://doi.org/10.3390/app15158653 - 5 Aug 2025
Abstract
In clinical decision support systems (CDSSs), where accurate classification of drug–drug interactions (DDIs) can directly affect treatment safety and outcomes, identifying drug interactions is a major challenge. This work introduces a scalable approach for classifying DDIs using a fine-tuned biomedical language model. The method uses BiomedBERT, a domain-specific version of bidirectional encoder representations from transformers (BERT) pre-trained on biomedical literature. Low-rank adaptation (LoRA) was used to fine-tune the model on the DrugBank dataset, reducing the resources needed during fine-tuning. The objective was to classify DDIs into two clinically distinct categories: synergistic and antagonistic interactions. A pseudo-labeling strategy was devised to address the scarcity of labeled data. A curated ground-truth dataset was constructed from polarity-labeled interaction entries in DrugComb and verified DrugBank antagonism pairs. The fine-tuned model then infers the interaction types present in the remaining unlabeled data. A checkpointing system saves predictions and confidence scores in small batches, so the process can resume after interruption and is resilient to system crashes. The framework logs every prediction and confidence score it generates, allowing results to be revisited and refined later, either manually by experts or through automated updates, without discarding low-confidence cases as traditional threshold-based methods often do. It was built with efficiency in mind, so it can handle large volumes of biomedical text without heavy computational demands.
Rather than focusing on model novelty, this research demonstrates how existing biomedical transformers can be adapted to polarity-aware DDI classification with minimal computational overhead, emphasizing deployment feasibility and clinical relevance. Full article
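The LoRA adaptation at the heart of this pipeline can be sketched as a frozen projection plus a trainable low-rank update. The plain-NumPy `LoRALinear` below is a hedged illustration, not the authors' implementation: the weights and dimensions are made up, and the real work would apply a LoRA library to BiomedBERT's transformer layers rather than a single linear head.

```python
import numpy as np

class LoRALinear:
    """Frozen linear layer with a trainable low-rank update:
    y = x @ W + (alpha / r) * x @ A @ B.
    """
    def __init__(self, d_in, d_out, r=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.02 * rng.standard_normal((d_in, d_out))  # frozen pre-trained weight
        self.A = 0.01 * rng.standard_normal((d_in, r))      # trainable down-projection
        self.B = np.zeros((r, d_out))                       # trainable, zero-initialised
        self.scale = alpha / r

    def forward(self, x):
        return x @ self.W + self.scale * (x @ self.A) @ self.B

# Head for the two polarity classes (synergistic vs. antagonistic).
layer = LoRALinear(d_in=768, d_out=2)
x = np.random.default_rng(1).standard_normal((4, 768))
logits = layer.forward(x)
# With B zero-initialised, the adapter starts as an exact no-op on the base model.
assert np.allclose(logits, x @ layer.W)
```

Because `B` starts at zero, training perturbs the frozen model only through the r-dimensional bottleneck, which is what keeps fine-tuning cheap relative to updating all of the base model's parameters.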