Search Results (12,070)

Search Parameters:
Keywords = Decision-making approach

22 pages, 514 KiB  
Article
Fuzzy Hypothesis Testing for Radar Detection: A Statistical Approach for Reducing False Alarm and Miss Probabilities
by Ahmed K. Elsherif, Hanan Haj Ahmad, Mohamed Aboshady and Basma Mostafa
Mathematics 2025, 13(14), 2299; https://doi.org/10.3390/math13142299 - 17 Jul 2025
Abstract
This paper addresses a fundamental challenge in statistical radar detection systems: optimizing the trade-off between the probability of a false alarm (PFA) and the probability of a miss (PM). These two metrics are inversely related and critical for performance evaluation. Traditional detection approaches often enhance one aspect at the expense of the other, limiting their practical applicability. To overcome this limitation, a fuzzy hypothesis testing framework is introduced that improves decision making under uncertainty by incorporating both crisp and fuzzy data representations. The methodology is divided into three phases. In the first phase, we reduce the probability of false alarm PFA while maintaining a constant probability of miss PM using crisp data characterized by deterministic values and classical statistical thresholds. In the second phase, the inverse scenario is considered: minimizing PM while keeping PFA fixed. This is achieved through parameter tuning and refined threshold calibration. In the third phase, a strategy is developed to simultaneously enhance both PFA and PM, despite their inverse correlation, by adopting adaptive decision rules. To further strengthen system adaptability, fuzzy data are introduced, which effectively model imprecision and ambiguity. This enhances robustness, particularly in scenarios where rapid and accurate classification is essential. The proposed methods are validated through both real and synthetic simulations of radar measurements, demonstrating their ability to enhance detection reliability across diverse conditions. The findings confirm the applicability of fuzzy hypothesis testing for modern radar systems in both civilian and military contexts, providing a statistically sound and operationally applicable approach for reducing detection errors and optimizing system performance. Full article
(This article belongs to the Special Issue New Advance in Applied Probability and Statistical Inference)
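The inverse PFA/PM relationship that motivates this work can be seen in a minimal unit-variance Gaussian threshold detector. This is a generic textbook sketch of the crisp-data baseline, not the paper's fuzzy framework; `pfa_pm` and its `snr` parameter are illustrative names:

```python
from math import erfc, sqrt

def gaussian_tail(z):
    # P(Z > z) for a standard normal variable
    return 0.5 * erfc(z / sqrt(2.0))

def pfa_pm(threshold, snr):
    """False-alarm and miss probabilities for a threshold detector with
    H0 ~ N(0, 1) (noise only) and H1 ~ N(snr, 1) (target present)."""
    pfa = gaussian_tail(threshold)              # "target" declared under H0
    pm = 1.0 - gaussian_tail(threshold - snr)   # "no target" declared under H1
    return pfa, pm

# Sweeping the threshold lowers one error probability while raising the other:
for t in (0.5, 1.0, 1.5, 2.0):
    pfa, pm = pfa_pm(t, snr=2.0)
    print(f"threshold={t:.1f}  PFA={pfa:.4f}  PM={pm:.4f}")
```

Raising the threshold trades false alarms for misses; the paper's three phases correspond to pinning one of the two probabilities (or neither) while improving the other.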
13 pages, 687 KiB  
Article
Turkish Chest X-Ray Report Generation Model Using the Swin Enhanced Yield Transformer (Model-SEY) Framework
by Murat Ucan, Buket Kaya and Mehmet Kaya
Diagnostics 2025, 15(14), 1805; https://doi.org/10.3390/diagnostics15141805 - 17 Jul 2025
Abstract
Background/Objectives: Extracting meaningful medical information from chest X-ray images and transcribing it into text is a complex task that requires a high level of expertise and directly affects clinical decision-making processes. Automatic reporting systems for this field in Turkish represent an important gap in scientific research, as they have not been sufficiently addressed in the existing literature. Methods: A deep learning-based approach called Model-SEY was developed with the aim of automatically generating Turkish medical reports from chest X-ray images. The Swin Transformer structure was used in the encoder part of the model to extract image features, while the text generation process was carried out using the cosmosGPT architecture, which was adapted specifically for the Turkish language. Results: With the permission of the ethics committee, a new dataset was created using image–report pairs obtained from Elazığ Fethi Sekin City Hospital and the Indiana University Chest X-Ray dataset, and experiments were conducted on this new dataset. In the tests conducted within the scope of the study, scores of 0.6412, 0.5335, 0.4395, 0.4395, 0.3716, and 0.2240 were obtained in the BLEU-1, BLEU-2, BLEU-3, BLEU-4, and ROUGE word overlap evaluation metrics, respectively. Conclusions: Quantitative and qualitative analyses of medical reports autonomously generated by the proposed model have shown that they are meaningful and consistent. The proposed model is one of the first studies in the field of autonomous reporting using deep learning architectures specific to the Turkish language, representing an important step forward in this field. It will also reduce potential human errors during diagnosis by supporting doctors in their decision-making. Full article
(This article belongs to the Special Issue Artificial Intelligence for Health and Medicine)
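For readers unfamiliar with the word-overlap metrics reported above, BLEU-1 reduces to clipped unigram precision times a brevity penalty. A minimal sketch, illustrative only and not the study's evaluation code:

```python
from collections import Counter
from math import exp

def bleu1(candidate, reference):
    """Clipped unigram precision with brevity penalty (BLEU-1)."""
    cand, ref = candidate.split(), reference.split()
    cand_counts, ref_counts = Counter(cand), Counter(ref)
    # Each candidate word is credited at most as often as it appears in the reference
    clipped = sum(min(n, ref_counts[w]) for w, n in cand_counts.items())
    precision = clipped / len(cand)
    # Brevity penalty: punish candidates shorter than the reference
    bp = 1.0 if len(cand) >= len(ref) else exp(1 - len(ref) / len(cand))
    return bp * precision

print(bleu1("the cat sat on the mat", "the cat is on the mat"))  # 5/6 ≈ 0.833
```

Higher-order BLEU-n scores apply the same clipping to n-gram counts and combine the precisions geometrically, which is why the reported values fall as n grows.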
19 pages, 3923 KiB  
Article
Automated Aneurysm Boundary Detection and Volume Estimation Using Deep Learning
by Alireza Bagheri Rajeoni, Breanna Pederson, Susan M. Lessner and Homayoun Valafar
Diagnostics 2025, 15(14), 1804; https://doi.org/10.3390/diagnostics15141804 - 17 Jul 2025
Abstract
Background/Objective: Precise aneurysm volume measurement offers a transformative edge for risk assessment and treatment planning in clinical settings. Currently, clinical assessments rely heavily on manual review of medical imaging, a process that is time-consuming and prone to inter-observer variability. The widely accepted standard of care primarily focuses on measuring aneurysm diameter at its widest point, providing a limited perspective on aneurysm morphology and lacking efficient methods to measure aneurysm volumes. Yet, volume measurement can offer deeper insight into aneurysm progression and severity. In this study, we propose an automated approach that leverages the strengths of pre-trained neural networks and expert systems to delineate aneurysm boundaries and compute volumes on an unannotated dataset from 60 patients. The dataset includes slice-level start/end annotations for the aneurysm but no pixel-wise aorta segmentations. Method: Our method utilizes a pre-trained UNet to automatically locate the aorta, employs SAM2 to track the aorta through vascular irregularities such as aneurysms down to the iliac bifurcation, and finally uses a Long Short-Term Memory (LSTM) network or an expert system to identify the beginning and end points of the aneurysm within the aorta. Results: Despite no manual aorta segmentation, our approach achieves promising accuracy, predicting the aneurysm start point with an R² score of 71%, the end point with an R² score of 76%, and the volume with an R² score of 92%. Conclusions: This technique has the potential to facilitate large-scale aneurysm analysis and improve clinical decision-making by reducing dependence on annotated datasets. Full article
(This article belongs to the Section Machine Learning and Artificial Intelligence in Diagnostics)
18 pages, 2446 KiB  
Review
Thematic Fragmentation and Convergence in Urban Flood Simulation Research: A 45-Year Bibliometric Mapping
by Ahmad Gamal, Mohammad Raditia Pradana, Bambang Hari Wibisono, Prananda Navitas and Jagannath Aryal
Urban Sci. 2025, 9(7), 280; https://doi.org/10.3390/urbansci9070280 - 17 Jul 2025
Abstract
Urban flooding presents a growing challenge amid rapid urbanization, climate variability, and fragmented governance. Although simulation and risk assessment tools have advanced considerably, their integration into urban planning remains limited. This study utilized a comprehensive bibliometric analysis of 1293 articles from the Scopus database, selected through a PRISMA-guided workflow, to examine the temporal, structural, and conceptual evolution of simulation, flood risk, and planning in urban flood research from 1980 to 2025. The findings reveal a thematic progression from engineering-centric approaches to broader discourses on resilience, adaptation, and systemic risk. However, disciplinary fragmentation persists, with technical modeling, infrastructure planning, and governance still weakly connected. Despite a shared vocabulary around climate risk and resilience, practical integration into decision-making frameworks remains underdeveloped. The study highlights the need for more cohesive research-practice linkages and calls for frameworks that better align simulation outputs with urban planning imperatives. Full article
22 pages, 1718 KiB  
Review
A Review on Risk and Reliability Analysis in Photovoltaic Power Generation
by Ahmad Zaki Abdul Karim, Mohamad Shaiful Osman and Mohd. Khairil Rahmat
Energies 2025, 18(14), 3790; https://doi.org/10.3390/en18143790 - 17 Jul 2025
Abstract
Precise evaluation of risk and reliability is crucial for decision making and predicting the outcome of investment in a photovoltaic power system (PVPS) due to its intermittent source. This paper explores different methodologies for risk evaluation and reliability assessment, which can be categorized into qualitative, quantitative, and hybrid qualitative and quantitative (HQQ) approaches. Qualitative methods include failure mode analysis, graphical analysis, and hazard analysis, while quantitative methods include analytical methods, stochastic methods, Bayes’ theorem, reliability optimization, multi-criteria analysis, and data utilization. HQQ methodology combines table-based and visual analysis methods. Currently, reliability assessment techniques such as mean time between failures (MTBF), system average interruption frequency index (SAIFI), and system average interruption duration index (SAIDI) are commonly used to predict PVPS performance. However, alternative methods such as economic metrics like the levelized cost of energy (LCOE) and net present value (NPV) can also be used. Therefore, risk and reliability approaches should be applied together to improve the accuracy of predicting significant aspects in the photovoltaic industry. Full article
(This article belongs to the Section B: Energy and Environment)
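As a sketch of one of the economic metrics named above, LCOE in its standard simplified form is discounted lifetime cost divided by discounted lifetime energy. The `lcoe` helper and the sample system figures below are hypothetical:

```python
def lcoe(capex, annual_om, annual_energy_kwh, discount_rate, years):
    """Levelized cost of energy: discounted lifetime cost per
    discounted kWh produced (simplified: constant O&M and output)."""
    costs = capex + sum(annual_om / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    energy = sum(annual_energy_kwh / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Hypothetical 5 kW PV system: $6000 install, $100/yr O&M,
# 7500 kWh/yr output, 5% discount rate, 25-year life.
print(f"${lcoe(6000, 100, 7500, 0.05, 25):.3f}/kWh")
```

Because the capital cost is paid up front while the energy is discounted, a higher discount rate raises the LCOE of a PV system even though its fuel is free.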
23 pages, 852 KiB  
Article
Open Data to Promote the Economic and Commercial Development of the Housing Sector: The Case of Spain
by Ricardo Curto-Rodríguez, Rafael Marcos-Sánchez, Alicia Zaragoza-Benzal and Daniel Ferrández
Urban Sci. 2025, 9(7), 277; https://doi.org/10.3390/urbansci9070277 - 17 Jul 2025
Abstract
Data is the starting point for generating information and knowledge in the decision-making process. Open data, which is information disclosed free of charge through open licenses and reusable formats, has great potential for value creation. Therefore, the objective of this research is to evaluate Spanish autonomous communities’ open data initiatives in a category of information of vital importance: housing. The methodology employed was a population analysis of datasets labeled as housing, followed by a necessary data cleansing process due to the identification of various errors, which reduced the number of labeled datasets from 1000 to 599. Only 12 of the 17 autonomous communities provided this type of information. The analysis of the results reveals that autonomous communities’ approaches to open data initiatives are highly heterogeneous and that the supply is irregular, with the Basque Country accounting for 70% of the datasets considered in the research. The creation of an indicator that equally assesses the existence of information and file formats (breadth and reusability) continues to identify the Basque Country as the undisputed leader, with Catalonia and Cantabria in second and third place, the only autonomous communities to exceed 50 points out of a possible 100. The study concludes by highlighting that the lack of uniformity in the formulation and implementation of open data policies will limit the use of information and, consequently, its value. Therefore, a series of recommendations is issued in this regard. Full article
17 pages, 10396 KiB  
Article
Feature Selection Based on Three-Dimensional Correlation Graphs
by Adam Dudáš and Aneta Szoliková
AppliedMath 2025, 5(3), 91; https://doi.org/10.3390/appliedmath5030091 - 17 Jul 2025
Abstract
The process of feature selection is a critical component of any decision-making system incorporating machine or deep learning models applied to multidimensional data. Feature selection on input data can be performed using a variety of techniques, such as correlation-based methods, wrapper-based methods, or embedded methods. However, many conventionally used approaches do not support backwards interpretability of the selected features, making their application in real-world scenarios impractical and difficult to implement. This work addresses that limitation by proposing a novel correlation-based strategy for feature selection in regression tasks, based on a three-dimensional visualization of correlation analysis results—referred to as three-dimensional correlation graphs. The main objective of this study is the design, implementation, and experimental evaluation of this graphical model through a case study using a multidimensional dataset with 28 attributes. The experiments assess the clarity of the visualizations and their impact on regression model performance, demonstrating that the approach reduces dimensionality while maintaining or improving predictive accuracy, enhances interpretability by uncovering hidden relationships, and achieves better or comparable results to conventional feature selection methods. Full article
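The general correlation-filter idea behind such methods can be sketched in two dimensions (the paper itself works with three-dimensional correlation graphs; the helper name, thresholds, and data below are hypothetical):

```python
import numpy as np

def select_features(X, y, target_thresh=0.3, redundancy_thresh=0.9):
    """Correlation-based filter: keep features well correlated with the
    target, then greedily drop any feature highly correlated with an
    already-kept one."""
    n_features = X.shape[1]
    target_corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                            for j in range(n_features)])
    # Consider features in decreasing order of relevance to the target
    candidates = [j for j in np.argsort(-target_corr)
                  if target_corr[j] >= target_thresh]
    kept = []
    for j in candidates:
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < redundancy_thresh
               for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
x1 = x0 + rng.normal(scale=0.3, size=200)   # noisy copy of x0 (redundant)
x2 = rng.normal(size=200)                   # irrelevant noise
y = 2 * x0 + rng.normal(scale=0.1, size=200)
X = np.column_stack([x0, x1, x2])
print(select_features(X, y))  # one of the x0/x1 pair survives; x2 is dropped
```

Filters like this are interpretable by construction, which is the property the paper's 3D visualization is designed to strengthen.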
5 pages, 488 KiB  
Proceeding Paper
Digital Twins for Circular Economy Optimization: A Framework for Sustainable Engineering Systems
by Shubham Gupta
Proceedings 2025, 121(1), 4; https://doi.org/10.3390/proceedings2025121004 - 16 Jul 2025
Abstract
This paper introduces sustainable engineering systems built using digital twin technology and circular economy principles. This research presents a framework for monitoring, modeling, and making decisions in real time using virtual replicas of physical products, processes, and systems across product lifecycles. A case study showed that, through a digital twin, waste was reduced by 27%, energy consumption was reduced by 32%, and the resource recovery rate increased to 45%. The proposed framework employs machine learning algorithms, IoT sensor networks, and advanced data analytics to support closed-loop material flows. The results show how digital twins can advance circular economy goals by identifying inefficiencies, predicting maintenance needs, and optimizing resource use. This integration is a promising approach for industry, enabling more sustainable operations while maintaining economic viability. Full article
19 pages, 1196 KiB  
Article
The Effects of Landmark Salience on Drivers’ Spatial Cognition and Takeover Performance in Autonomous Driving Scenarios
by Xianyun Liu, Yongdong Zhou and Yunhong Zhang
Behav. Sci. 2025, 15(7), 966; https://doi.org/10.3390/bs15070966 - 16 Jul 2025
Abstract
With the increasing prevalence of autonomous vehicles (AVs), drivers’ spatial cognition and takeover performance have become critical to traffic safety. This study investigates the effects of landmark salience—specifically visual and structural salience—on drivers’ spatial cognition and takeover behavior in autonomous driving scenarios. Two simulator-based experiments were conducted. Experiment 1 examined the impact of landmark salience on spatial cognition tasks, including route re-cruise, scene recognition, and sequence recognition. Experiment 2 assessed the effects of landmark salience on takeover performance. Results indicated that salient landmarks generally enhance spatial cognition; the effects of visual and structural salience differ in scope and function in autonomous driving scenarios. Landmarks with high visual salience not only improved drivers’ accuracy in making intersection decisions but also significantly reduced the time it took to react to a takeover. In contrast, structurally salient landmarks had a more pronounced effect on memory-based tasks, such as scene recognition and sequence recognition, but showed a limited influence on dynamic decision-making tasks like takeover response. These findings underscore the differentiated roles of visual and structural landmark features, highlighting the critical importance of visually salient landmarks in supporting both navigation and timely takeover during autonomous driving. The results provide practical insights for urban road design, advocating for the strategic placement of visually prominent landmarks at key decision points. This approach has the potential to enhance both navigational efficiency and traffic safety. Full article
(This article belongs to the Section Cognition)
32 pages, 5175 KiB  
Article
Scheduling and Routing of Device Maintenance for an Outdoor Air Quality Monitoring IoT
by Peng-Yeng Yin
Sustainability 2025, 17(14), 6522; https://doi.org/10.3390/su17146522 - 16 Jul 2025
Abstract
Air quality monitoring IoT is one of the approaches to achieving a sustainable future. However, the large area of IoT and the high number of monitoring microsites pose challenges for device maintenance to guarantee quality of service (QoS) in monitoring. This paper proposes a novel maintenance programming model for a large-area IoT containing 1500 monitoring microsites. In contrast to classic device maintenance, the addressed programming scenario considers the division of appropriate microsites into batches, the determination of the batch maintenance date, vehicle routing for the delivery of maintenance services, and a set of hard constraints such as QoS in air quality monitoring, the maximum number of labor working hours, and an upper limit on the total CO2 emissions. Heuristics are proposed to generate the batches of microsites and the scheduled maintenance date for the batches. A genetic algorithm is designed to find the shortest routes by which to visit the batch microsites by a fleet of vehicles. Simulations are conducted based on government open data. The experimental results show that the maintenance and transportation costs yielded by the proposed model grow linearly with the number of microsites if the fleet size is also linearly related to the microsite number. The mean time between two consecutive cycles is around 17 days, which is generally sufficient for the preparation of the required maintenance materials and personnel. With the proposed method, the decision-maker can circumvent the difficulties in handling the hard constraints, and the allocation of maintenance resources, including budget, materials, and engineering personnel, is easier to manage. Full article
(This article belongs to the Section Sustainable Engineering and Science)
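The route-optimization component can be illustrated with a bare permutation genetic algorithm for a single vehicle. The paper's model adds batching, scheduling, and hard constraints; every name and parameter below is a hypothetical simplification:

```python
import random
from math import hypot

def route_length(route, pts):
    # Closed tour: route[i - 1] wraps to the last point when i == 0
    return sum(hypot(pts[route[i]][0] - pts[route[i - 1]][0],
                     pts[route[i]][1] - pts[route[i - 1]][1])
               for i in range(len(route)))

def genetic_route(pts, pop_size=60, generations=200, seed=42):
    """Tiny permutation GA: truncation selection, simplified order
    crossover, swap mutation, elitism."""
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: route_length(r, pts))
        next_pop = pop[:10]                      # elitism: keep the 10 best
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:30], 2)       # select among the fitter half
            i, j = sorted(rng.sample(range(n), 2))
            # Simplified order crossover: copy a segment of a, fill from b
            child = a[i:j] + [c for c in b if c not in a[i:j]]
            if rng.random() < 0.2:               # swap mutation
                x, y = rng.sample(range(n), 2)
                child[x], child[y] = child[y], child[x]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=lambda r: route_length(r, pts))

rng = random.Random(1)
sites = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(15)]
best = genetic_route(sites)
print(route_length(best, sites))
```

Encoding a route as a permutation keeps every crossover and mutation result feasible, which is why GAs are a common fit for vehicle-routing subproblems like this one.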
20 pages, 927 KiB  
Article
An Optimization Model with “Perfect Rationality” for Expert Weight Determination in MAGDM
by Yuetong Liu, Chaolang Hu, Shiquan Zhang and Qixiao Hu
Mathematics 2025, 13(14), 2286; https://doi.org/10.3390/math13142286 - 16 Jul 2025
Abstract
Given the evaluation data of all the experts in multi-attribute group decision making, this paper establishes an optimization model for learning and determining expert weights based on minimizing the sum of the differences between the individual evaluation and the overall consistent evaluation results. The paper proves the uniqueness of the solution of the optimization model and rigorously proves that the expert weights obtained by the model have “perfect rationality”, i.e., the weights are inversely proportional to the distance to the “overall consistent scoring point”. Based on the above characteristics, the optimization problem is further transformed into solving a system of nonlinear equations to obtain the expert weights. Finally, numerical experiments are conducted to verify the rationality of the model and the feasibility of transforming the problem into a system of nonlinear equations. Numerical experiments demonstrate that the deviation metric for the expert weights produced by our optimization model is significantly lower than that obtained under equal weighting or the entropy weight method, and it approaches zero. Within numerical tolerance, this confirms the model’s “perfect rationality”. Furthermore, the weights determined by solving the corresponding nonlinear equations coincide exactly with the optimization solution, indicating that a dedicated algorithm grounded in perfect rationality can directly solve the model. Full article
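The "perfect rationality" property stated above — weights inversely proportional to each expert's distance from the overall consistent scoring point — can be sketched directly. In the paper that point emerges from the optimization model; here it is supplied by hand, and `expert_weights` is a hypothetical helper:

```python
import numpy as np

def expert_weights(scores, consensus):
    """Weights inversely proportional to each expert's distance from a
    given consensus point, normalized to sum to 1."""
    d = np.linalg.norm(scores - consensus, axis=1)
    inv = 1.0 / np.maximum(d, 1e-12)   # guard against a zero distance
    return inv / inv.sum()

# Three experts scoring two attributes; the consensus point is an example input.
scores = np.array([[0.8, 0.6],
                   [0.7, 0.7],
                   [0.2, 0.9]])
consensus = np.array([0.75, 0.65])
w = expert_weights(scores, consensus)
print(w)  # experts closer to the consensus receive larger weights
```

The paper's contribution is proving that the weights solving its optimization model necessarily have this inverse-distance form, which is what licenses replacing the optimization by a system of nonlinear equations.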
21 pages, 854 KiB  
Review
Non-Invasive Ventilation: When, Where, How to Start, and How to Stop
by Mary Zimnoch, David Eldeiry, Oluwabunmi Aruleba, Jacob Schwartz, Michael Avaricio, Oki Ishikawa, Bushra Mina and Antonio Esquinas
J. Clin. Med. 2025, 14(14), 5033; https://doi.org/10.3390/jcm14145033 - 16 Jul 2025
Abstract
Non-invasive ventilation (NIV) is a cornerstone in the management of acute and chronic respiratory failure, offering critical support without the risks of intubation. However, successful weaning from NIV remains a complex, high-stakes process. Poorly timed or improperly executed weaning significantly increases morbidity and mortality, yet current clinical practice often relies on subjective judgment rather than evidence-based protocols. This manuscript reviews the current landscape of NIV weaning, emphasizing structured approaches, objective monitoring, and predictors of weaning success or failure. It examines guideline-based indications, monitoring strategies, and various weaning techniques—gradual and abrupt—with evidence of their efficacy across different patient populations. Predictive tools such as the Rapid Shallow Breathing Index, Lung Ultrasound Score, Diaphragm Thickening Fraction, ROX index, and HACOR score are analyzed for their diagnostic value. Additionally, this review underscores the importance of care setting—ICU, step-down unit, or general ward—and how it influences outcomes. Finally, it highlights critical gaps in research, especially around weaning in non-ICU environments. By consolidating current evidence and identifying predictors and pitfalls, this article aims to support clinicians in making safe, timely, and patient-specific NIV weaning decisions. In the current literature, there are gaps regarding patient selection and lack of universal protocolization for initiation and de-escalation of NIV as the data has been scattered. This review aims to consolidate the relevant information to be utilized by clinicians throughout multiple levels of care in all hospital systems. Full article
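Two of the predictive tools named above are simple bedside ratios; a sketch for orientation only, not clinical guidance — thresholds and usage vary across the studies this review surveys:

```python
def rox_index(spo2, fio2, resp_rate):
    """ROX index: (SpO2% / FiO2 fraction) / respiratory rate.
    Higher values have been associated with success of non-invasive
    oxygen support in published studies."""
    return (spo2 / fio2) / resp_rate

def rsbi(resp_rate, tidal_volume_l):
    """Rapid Shallow Breathing Index: breaths/min per litre of tidal
    volume; higher values suggest rapid, shallow breathing and a
    higher risk of weaning failure."""
    return resp_rate / tidal_volume_l

print(rox_index(spo2=94, fio2=0.5, resp_rate=22))  # (94/0.5)/22 ≈ 8.5
print(rsbi(resp_rate=25, tidal_volume_l=0.35))     # 25/0.35 ≈ 71.4
```

Indices like these are exactly the objective monitoring inputs the review argues should supplement subjective judgment during weaning.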
42 pages, 2145 KiB  
Article
Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
by Maxim Majlatow, Fahim Ahmed Shakil, Andreas Emrich and Nijat Mehdiyev
Appl. Sci. 2025, 15(14), 7925; https://doi.org/10.3390/app15147925 - 16 Jul 2025
Abstract
In high-stakes decision-making environments, predictive models must deliver not only high accuracy but also reliable uncertainty estimations and transparent explanations. This study explores the integration of probability calibration techniques with Conformal Prediction (CP) within a predictive process monitoring (PPM) framework tailored to healthcare analytics. CP is renowned for its distribution-free prediction regions and formal coverage guarantees under minimal assumptions; however, its practical utility critically depends on well-calibrated probability estimates. We compare a range of post-hoc calibration methods—including parametric approaches like Platt scaling and Beta calibration, as well as non-parametric techniques such as Isotonic Regression and Spline calibration—to assess their impact on aligning raw model outputs with observed outcomes. By incorporating these calibrated probabilities into the CP framework, our multilayer analysis evaluates improvements in prediction region validity, including tighter coverage gaps and reduced minority error contributions. Furthermore, we employ SHAP-based explainability to explain how calibration influences feature attribution for both high-confidence and ambiguous predictions. Experimental results on process-driven healthcare data indicate that the integration of calibration with CP not only enhances the statistical robustness of uncertainty estimates but also improves the interpretability of predictions, thereby supporting safer and robust clinical decision-making. Full article
(This article belongs to the Special Issue Digital Innovations in Healthcare)
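The conformal half of such a pipeline can be sketched independently of the calibration method: given (ideally well-calibrated) class probabilities, split conformal prediction turns them into prediction sets with a finite-sample marginal coverage guarantee. All names below are hypothetical and the data are synthetic:

```python
import numpy as np

def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    """Split CP for classification: nonconformity score is 1 - p(true class).
    Returns the score quantile used to build ~(1 - alpha)-coverage sets."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    k = int(np.ceil((n + 1) * (1 - alpha)))   # finite-sample correction
    return np.sort(scores)[min(k, n) - 1]

def prediction_set(test_probs, qhat):
    # Include every label whose nonconformity 1 - p falls within the threshold
    return [np.where(1.0 - p <= qhat)[0].tolist() for p in test_probs]

rng = np.random.default_rng(0)
# Synthetic calibration set: well-calibrated two-class probabilities
p1 = rng.uniform(size=500)
cal_probs = np.column_stack([1 - p1, p1])
cal_labels = (rng.uniform(size=500) < p1).astype(int)
qhat = conformal_threshold(cal_probs, cal_labels, alpha=0.1)
print(prediction_set(np.array([[0.95, 0.05], [0.5, 0.5]]), qhat))
```

A confident prediction yields a singleton set while an ambiguous one yields both labels — and, as the abstract notes, how tight those sets are depends directly on how well the input probabilities were calibrated.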
12 pages, 305 KiB  
Article
Discrepancies in Recommendations on Pharmacokinetic Drug Interactions for Anticancer Medications and Direct Oral Anticoagulants (DOAC): A Comparative Analysis of Different Clinical Decision Support Systems and Sources
by Karolina Nowinski and Roza Chaireti
Pharmaceuticals 2025, 18(7), 1044; https://doi.org/10.3390/ph18071044 - 16 Jul 2025
Abstract
Background/objectives: In some cases of concomitant use of direct oral anticoagulants (DOAC) and certain anticancer medications, pharmacokinetic interactions are expected; however, clinical data is scarce. This report reviews the recommendations on the use of DOAC concurrently with anticancer drugs according to different clinical decision support systems and sources, with a focus on discrepancies. Methods: We reviewed the recommendations from the American Heart Association (AHA), European Heart Rhythm Association (EHRA), summary of product characteristics (SPC) in FASS (Swedish medicine information portal), the Swedish clinical decision support system Janusmed, and information from the Food and Drug Administration (FDA) on the concomitant use of apixaban, edoxaban, and rivaroxaban (activated factor X (FXa) inhibitors) and 80 anticancer drugs from 11 categories (240 drug pairs). Results: No warnings of expected pharmacokinetic drug interactions between FXa inhibitors and anticancer drugs were found for 155 drug pairs (65%) across all sources. The remaining 35% of drug pairs were flagged as having possible interactions with FXa inhibitors according to at least one source. Discrepancies in the recommendations from the different sources were reported. The reported discrepancies were, for the most part, associated with different assessments of the mechanism and the extent of pharmacokinetic interactions of each anticancer medication. Also, knowledge sources have different approaches to reporting potential interactions, in some cases reporting clinically relevant strictly pharmacokinetic interactions, whereas others include even patient-specific factors. Conclusions: The lack of clinical data and different recommendations can make clinical decisions on the concomitant use of DOAC and anticancer drugs difficult. Our compilation is meant to assist clinicians in making decisions based on the available evidence, even if it is scarce. Full article
18 pages, 1583 KiB  
Article
Developing a Dynamic Simulation Model for Point-of-Care Ultrasound Assessment and Learning Curve Analysis
by Sandra Usaquén-Perilla, Laura Valentina Bocanegra-Villegas and Jose Isidro García-Melo
Systems 2025, 13(7), 591; https://doi.org/10.3390/systems13070591 - 16 Jul 2025
Abstract
The development of new diagnostic technologies is accelerating, and budgetary constraints in the health sector necessitate a systematic decision-making process to acquire emerging technologies. Health Technology Assessment methodologies integrate technology, clinical efficacy, patient safety, and organizational and financial factors in this context. However, these methodologies do not include the learning curve, a critical factor in operator-dependent technologies. This study presents an evaluation model incorporating the learning curve, developed from the domains of the AdHopHTA project. Using System Dynamics (SD), the model was validated and calibrated as a case study to evaluate the use of Point-of-Care Ultrasound (POCUS) in identifying dengue. This approach allowed for the analysis of the impact of the learning curve and patient demand on the revenues and costs of the healthcare system and the cost–benefit indicator associated with dengue detection. The model assesses physician competency and how different training strategies and frequencies of use affect POCUS adoption. The findings underscore the importance of integrating the learning curve into decision-making. This study highlights the need for further investigation into the barriers that limit the effective use of POCUS, particularly in resource-limited settings. It proposes a framework to improve the integration of this technology into clinical practice for early dengue detection. Full article
(This article belongs to the Special Issue System Dynamics Modeling and Simulation for Public Health)