Search Results (389)

Search Parameters:
Keywords = scientific transparency

25 pages, 1268 KB  
Article
Sustainable Development of Smart Regions via Cybersecurity of National Infrastructure: A Fuzzy Risk Assessment Approach
by Oleksandr Korchenko, Oleksandr Korystin, Volodymyr Shulha, Svitlana Kazmirchuk, Serhii Demediuk and Serhii Zybin
Sustainability 2025, 17(19), 8757; https://doi.org/10.3390/su17198757 - 29 Sep 2025
Abstract
This article proposes a scientifically grounded approach to risk assessment for the infrastructural and functional systems that underpin the development of digitally transformed regional territories under conditions of high threat dynamics and sociotechnical instability. The core methodology models multifactorial threats through fuzzy set theory and logic-linguistic analysis, enabling consideration of parameter uncertainty, fragmented expert input, and the lack of a unified risk landscape within complex infrastructure environments. Special emphasis is placed on the components of technogenic, informational, and mobile infrastructure that ensure regional viability across the planning, response, and recovery phases. The results confirm the relevance of the approach for assessing infrastructure resilience risks in regional spatial-functional systems and demonstrate its potential for integration into sustainable development strategies at the level of regional governance, cross-sectoral planning, and a cultural reevaluation of analytics as an ethically grounded practice for cultivating trust, transparency, and professional maturity.
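The fuzzy risk model itself is not reproduced in the abstract; as a rough illustration of the kind of Mamdani-style fuzzy inference such an approach builds on, the sketch below scores risk from likelihood and impact on a 0–1 scale. All membership parameters and rule outputs are hypothetical, not the authors' model:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_risk(likelihood, impact):
    """Mini rule base: fire each rule at min(antecedent memberships),
    then defuzzify by the weighted average of rule outputs."""
    low_l, high_l = tri(likelihood, -0.5, 0.0, 0.6), tri(likelihood, 0.4, 1.0, 1.5)
    low_i, high_i = tri(impact, -0.5, 0.0, 0.6), tri(impact, 0.4, 1.0, 1.5)
    rules = [
        (min(high_l, high_i), 1.0),  # high likelihood & high impact -> high risk
        (min(high_l, low_i), 0.5),
        (min(low_l, high_i), 0.5),
        (min(low_l, low_i), 0.1),    # low & low -> low risk
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

The same pattern scales to more inputs and finer linguistic terms ("medium", "critical"); the paper's logic-linguistic analysis presumably elicits those terms and rules from experts.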
19 pages, 1025 KB  
Article
Research on Trade Credit Risk Assessment for Foreign Trade Enterprises Based on Explainable Machine Learning
by Mengjie Liao, Wanying Jiao and Jian Zhang
Information 2025, 16(10), 831; https://doi.org/10.3390/info16100831 - 26 Sep 2025
Abstract
As global economic integration deepens, import and export trade plays an increasingly vital role in China’s economy. To enhance regulatory efficiency and achieve scientific, transparent credit supervision, this study proposes a trade credit risk evaluation model based on interpretable machine learning that incorporates loss preferences. Key risk features are identified through a comprehensive interpretability framework combining SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME), forming an optimal feature subset. Using Light Gradient Boosting Machine (LightGBM) as the base model, a weight adjustment strategy is introduced to reduce costly misclassification of high-risk enterprises, effectively improving their recognition rate. Because this adjustment lowers overall accuracy, a Bagging ensemble framework is applied, which restores and slightly improves accuracy while keeping misclassification costs low. Experimental results demonstrate that the interpretability framework improves transparency and business applicability, the weight adjustment strategy enhances detection of high-risk enterprises, and Bagging balances overall classification performance. The proposed method reliably identifies high-risk enterprises while preserving overall model robustness, providing strong practical value for enterprise credit risk assessment and decision-making.
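The abstract names SHAP but does not show the pipeline. As a stdlib-only sketch of the exact Shapley computation that SHAP approximates, the snippet below averages each feature's marginal contribution over all orderings; the linear "scorer" and its weights are stand-ins for illustration, not the paper's LightGBM model:

```python
from itertools import permutations

def shapley_values(f, x, baseline):
    """Exact Shapley values by averaging marginal contributions
    over all feature orderings (tractable only for few features)."""
    n = len(x)
    phi = [0.0] * n
    perms = list(permutations(range(n)))
    for order in perms:
        z = list(baseline)       # start from the reference point
        prev = f(z)
        for i in order:
            z[i] = x[i]          # reveal feature i
            cur = f(z)
            phi[i] += cur - prev # marginal contribution of feature i
            prev = cur
    return [p / len(perms) for p in phi]

# Toy credit scorer: a linear model stands in for the trained LightGBM.
weights = [0.5, -0.2, 0.3]
model = lambda z: sum(w * v for w, v in zip(weights, z))

x = [1.0, 2.0, 3.0]          # enterprise to explain
baseline = [0.0, 0.0, 0.0]   # reference point (e.g. feature means)
phi = shapley_values(model, x, baseline)
```

For a linear model the result reduces to `w_i * (x_i - baseline_i)`, which is a useful sanity check; tree-based SHAP implementations compute the same quantity efficiently for gradient-boosted models.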

27 pages, 1722 KB  
Article
Same Coin, Different Value: A Multi-Year Comparative Analysis of Financial Performance of Open Access and Legacy Publishers
by George Peppas, Leonidas Papachristopoulos and Giannis Tsakonas
Publications 2025, 13(4), 46; https://doi.org/10.3390/publications13040046 - 24 Sep 2025
Viewed by 309
Abstract
We are living in an era in which the demand for Open Access to knowledge is growing and the need for transparency in scientific publishing has become imperative. The question that arises is whether openness constitutes the Achilles heel of the once highly profitable legacy publishing industry, or its Trojan horse for increasing revenues. At the same time, whether Open Access publishers can ensure their own sustainability through this model remains unanswered. This study implements a multi-year analysis (2019–2023) comparing the performance of Open Access and legacy publishers. Using a set of financial ratios grouped by profitability, liquidity, efficiency, and solvency, together with data on firm size (revenues, assets, and employee counts), we assess their financial performance. The results indicate that legacy publishers have enormous scale, stable profitability, and high leverage, but low liquidity and low return on equity. OA publishers, although smaller, show higher returns, better liquidity, and almost zero borrowing, but greater annual volatility. The study argues that OA publishers, despite their small size, can be as profitable as or more profitable than traditional publishers thanks to flexible structures and fast cash flows, but remain vulnerable owing to limited resources and the risk of acquisition. Legacy publishers, meanwhile, maintain their dominance by leveraging scale, strong brands, and investment capacity while adopting or acquiring OA models, creating a competitive environment in which scale and strategic differentiation are decisive.
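The specific ratio set is not listed in the abstract. A minimal sketch of one representative ratio per group named above (the function name and toy figures are illustrative, not the study's data):

```python
def publisher_ratios(revenue, net_income, total_assets, equity,
                     current_assets, current_liabilities, total_debt):
    """One representative ratio per group used when comparing publishers."""
    return {
        "net_margin": net_income / revenue,                   # profitability
        "current_ratio": current_assets / current_liabilities,  # liquidity
        "asset_turnover": revenue / total_assets,             # efficiency
        "debt_to_assets": total_debt / total_assets,          # solvency
        "roe": net_income / equity,                           # return on equity
    }

# Toy "legacy publisher" figures (millions): large, leveraged, low ROE pattern
r = publisher_ratios(revenue=200.0, net_income=40.0, total_assets=400.0,
                     equity=100.0, current_assets=90.0,
                     current_liabilities=60.0, total_debt=250.0)
```

Computing the same dictionary per publisher-year and comparing the distributions is, in outline, what a multi-year ratio analysis amounts to.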

24 pages, 1523 KB  
Review
Mapping the Intersection of Entrepreneurship, Digitalization, and the SDGs: A Scopus-Based Literature Review
by Panagiota Xanthopoulou and Alexandros Sahinidis
Sustainability 2025, 17(18), 8420; https://doi.org/10.3390/su17188420 - 19 Sep 2025
Viewed by 241
Abstract
This study examines the dynamic interconnection between entrepreneurship, digital transformation, and the Sustainable Development Goals (SDGs), placing the research question in a broader socio-technological context. The research is based on a descriptive review of 22 empirical studies drawn from the Scopus database concerning technological innovations adopted by business initiatives aimed at sustainable development. A total of 314 records were identified, 180 screened, and 22 empirical studies included. The studies originated primarily from Europe and Asia and applied quantitative (60%), qualitative (25%), or mixed (15%) methods. Thematic analysis identified the dominant technologies (artificial intelligence, blockchain, ERP systems, digital platforms, and ESG data analysis) as well as the methodological approaches followed in the international literature. The main findings indicate that digital transformation offers significant opportunities to enhance innovation, transparency, and social inclusion. Challenges also arise, however, such as digital inequalities, the lack of a strategy for alignment with the SDGs, and institutional weaknesses. In conclusion, the transition to sustainable digital business models requires a multidisciplinary approach, strengthened leadership, educational interventions, and institutional support. This study contributes theoretically and practically to the ongoing scientific dialogue on sustainable, digitally supported entrepreneurship.

26 pages, 20242 KB  
Article
Multi-Source Feature Selection and Explainable Machine Learning Approach for Mapping Nitrogen Balance Index in Winter Wheat Based on Sentinel-2 Data
by Botai Shi, Xiaokai Chen, Yiming Guo, Li Liu, Peng Li and Qingrui Chang
Remote Sens. 2025, 17(18), 3196; https://doi.org/10.3390/rs17183196 - 16 Sep 2025
Viewed by 387
Abstract
The Nitrogen Balance Index (NBI) is a key indicator of crop nitrogen status, but conventional monitoring methods are invasive, costly, and unsuitable for large-scale application. This study targets early-season winter wheat in the Guanzhong Plain and proposes a framework that integrates Sentinel-2 imagery with Sen2Res super-resolution reconstruction, multi-feature optimization, and interpretable machine learning. Super-resolved imagery showed improved spatial detail and stronger correlations of reflectance, texture, and vegetation indices with the NBI than native imagery. A two-stage feature-selection strategy, combining correlation analysis and recursive feature elimination, identified a compact set of key variables. Among the tested algorithms, the random forest model achieved the highest accuracy, with R2 = 0.77 and RMSE = 1.57, an improvement of about 20% over linear models. Shapley Additive Explanations revealed that red-edge and near-infrared features accounted for up to 75% of predictive contributions, highlighting their physiological relevance to nitrogen metabolism. Overall, the study contributes to the remote sensing of crop nitrogen status in three ways: (1) integration of super-resolution with feature fusion to overcome coarse spatial resolution, (2) a two-stage feature-optimization strategy to reduce redundancy, and (3) interpretable modeling to improve transparency. The proposed framework supports regional-scale NBI monitoring and provides a scientific basis for precision fertilization.
(This article belongs to the Special Issue Perspectives of Remote Sensing for Precision Agriculture)
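The two-stage selection idea (a correlation screen, then recursive elimination of the weakest feature) can be sketched in a few lines. The real pipeline refits a model such as a random forest at each elimination step; this stdlib sketch substitutes a correlation score for that refit, and the feature names, threshold, and toy data are all hypothetical:

```python
import statistics

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = (sum((a - mx) ** 2 for a in xs) * sum((b - my) ** 2 for b in ys)) ** 0.5
    return num / den if den else 0.0

def two_stage_select(features, target, r_min=0.3, keep=2):
    # Stage 1: correlation screen against the target variable
    kept = {name: xs for name, xs in features.items()
            if abs(pearson(xs, target)) >= r_min}
    # Stage 2: recursively drop the weakest remaining feature
    while len(kept) > keep:
        weakest = min(kept, key=lambda n: abs(pearson(kept[n], target)))
        del kept[weakest]
    return sorted(kept)

target = [1.0, 2.0, 3.0, 4.0, 5.0]             # toy NBI ground truth
features = {
    "red_edge": [2.0, 4.0, 6.0, 8.0, 10.0],    # perfectly correlated
    "ndvi":     [1.1, 2.0, 2.9, 4.2, 5.1],     # strongly correlated
    "noise":    [2.0, 5.0, 1.0, 4.0, 3.0],     # weakly correlated
}
selected = two_stage_select(features, target)
```

The screen cheaply discards clearly irrelevant bands; the recursive stage then resolves redundancy among the survivors, which is why the combination yields a compact variable set.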

24 pages, 12935 KB  
Article
Geohazard Susceptibility Assessment in Karst Terrain: A Novel Coupling Model Integrating Information Value and XGBoost Machine Learning in Guizhou Province, China
by Jiao Chen, Fufei Wu and Hongyin Hu
Appl. Sci. 2025, 15(18), 10077; https://doi.org/10.3390/app151810077 - 15 Sep 2025
Viewed by 253
Abstract
In this study, geological disasters in Guizhou Province serve as the research object, and a systematic susceptibility evaluation is conducted in light of the province’s frequent geological disasters. Existing research primarily applies single models, often with deficiencies in factor interpretation; it has not systematically integrated the advantages of the traditional information value model with multiple machine learning algorithms, nor introduced interpretable methods to analyze disaster mechanisms in depth. Here, the information value (IV) model is combined with machine learning algorithms, namely logistic regression (LR), decision tree (DT), support vector machine (SVM), random forest (RF), and extreme gradient boosting (XGBoost), to construct coupling models for evaluating susceptibility to geological disasters, with hyperparameters tuned by Bayesian optimization. The confusion matrix and receiver operating characteristic (ROC) curve are used to evaluate model accuracy, and the Shapley Additive exPlanations (SHAP) method quantifies the contribution of each influencing factor, improving the transparency and credibility of the models. The results show that the coupling models, especially the IV-XGB model, achieve the best performance (AUC = 0.9448), clearly identifying the northern Wujiang River Basin and the central karst core area as high-risk areas and clarifying the coupled “terrain–hydrology–human activities” disaster-causing mechanism. SHAP further identified NDVI, land use type, and elevation as the predominant controlling factors. This study presents a high-precision, interpretable modeling method for assessing susceptibility to geological disasters, providing a scientific basis for disaster prevention and control in Guizhou Province and areas with similar geological conditions.
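The information value component follows a standard construct: per bin of a conditioning factor, the weight of evidence compares event and non-event shares, and IV sums the weighted differences. A minimal stdlib sketch for one factor (the bin counts below are invented for illustration, not the Guizhou data):

```python
import math

def information_value(events, nonevents):
    """IV of a binned conditioning factor:
    sum over bins of (share_of_events - share_of_nonevents) * WoE."""
    te, tn = sum(events), sum(nonevents)
    iv = 0.0
    for e, n in zip(events, nonevents):
        pe, pn = e / te, n / tn
        woe = math.log(pe / pn)   # weight of evidence of the bin
        iv += (pe - pn) * woe
    return iv

# Toy slope-angle bins: landslide cells vs. stable cells per bin
iv_slope = information_value(events=[10, 30, 60], nonevents=[60, 30, 10])
```

In a coupled IV-XGBoost model, the per-bin WoE scores (rather than raw categories) typically become the inputs the boosted trees learn from.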

22 pages, 309 KB  
Article
From Religious Representation to Conceptual Truth: The Role of Religion in Hegel’s Philosophical System
by Guanyu Guo
Religions 2025, 16(9), 1187; https://doi.org/10.3390/rel16091187 - 15 Sep 2025
Viewed by 400
Abstract
The present study interprets the indispensable mediating role of religion within Hegel’s monistic system. It undertakes a systematic investigation of the development of Hegel’s religious thought across different periods, his logical reconstruction of multiple religions, and the positioning of religion within his system. The central argument is that religions, particularly Christianity, serve as a pivotal experiential and representational conduit facilitating Spirit’s ascent from the inherent dualism of consciousness, inherited from Descartes and solidified by Kant, to the monistic paradigm of the speculative concept (Begriff) or absolute knowing (absolutes Wissen). Whereas art offers immediate sensuous intuition (Anschauung) of the Absolute, philosophy achieves pure conceptual comprehension (begreifendes Denken). Religion’s function is twofold: it provides the essential communal form for grasping substantial content, and its representational form demands, with inherent necessity, its own sublation (Aufhebung) and elevation to conceptual truth. The rational content of Christianity, found in the logical structure of the Trinity and in the Incarnation (Menschwerdung Gottes), demands translation into the self-determining concept (Begriff). “The death of God” is posited by representation as a means to facilitate the subsequent reconstruction of the Concept. Philosophy sublates religion by preserving its truth-content, negating its inadequate form, and elevating it into pure conceptual truth. Consequently, religion functions as the indispensable “prelude to scientific truth” and the necessary pathway to absolute knowing, where absolute Spirit achieves complete self-transparency in the speculative Concept. Hegel thus affirms religion’s vital mediating role not as the endpoint but as the essential bridge enabling Spirit’s ascent from religious representation (Vorstellung) to conceptual absolute knowing. Finally, in exploring the interplay of truth and meaning, Hegel’s notions of Vorstellung and Begriff offer a speculative framework in which religious meaning is embodied in logical and historical contexts while philosophical truth transforms it into conceptual form, enriching both hermeneutics and dialectical understanding.
20 pages, 498 KB  
Article
Are the Media Transparent in Their Use of AI? Self-Regulation and Ethical Challenges in Newsrooms in Spain
by M. Ángeles Fernández-Barrero and Carlos Serrano-Martín
Journal. Media 2025, 6(3), 152; https://doi.org/10.3390/journalmedia6030152 - 13 Sep 2025
Viewed by 571
Abstract
The integration of artificial intelligence (AI) into journalism is rapidly transforming the way news is produced, raising important questions about ethics, transparency, and professional standards. This study examines how Spanish journalists perceive and manage the use of AI in their work, using a mixed-methods research design that combines quantitative and qualitative approaches. The quantitative component is a survey administered to 50 journalists working in newsrooms across several Spanish provinces, selected by random sampling; the qualitative component comprises eight in-depth interviews with journalists representing varied perspectives on AI use. Although AI improves efficiency in news production, it also raises ethical concerns, particularly about transparency, authorship, and content accuracy. In the absence of formal regulation, some media outlets and scientific institutions have begun to develop self-regulation protocols. The findings reveal widespread use of AI tools among journalists, although a minority strongly opposes them. Most media outlets lack internal policies on AI use, leading to reliance on personal self-regulation. Transparency is a major concern, as AI involvement is rarely disclosed, raising issues of trust, intellectual property, and editorial responsibility. The lack of clear internal guidelines creates uncertainty and inconsistent practices, and journalists are calling for defined regulatory frameworks to ensure the ethical and transparent integration of AI. Without transparency, audience trust can erode and journalistic integrity can be compromised.
(This article belongs to the Special Issue Reimagining Journalism in the Era of Digital Innovation)

25 pages, 8141 KB  
Article
Decoding Spatial Vitality in Historic Districts: A Grey Relational Analysis of Multidimensional Built Environment Factors in Shanghai’s Zhangyuan
by Yiming Song, Wang Zhang, Yunze Deng, Hongzhi Mo and Yuan Li
Land 2025, 14(9), 1869; https://doi.org/10.3390/land14091869 - 12 Sep 2025
Viewed by 379
Abstract
Enhancing the vitality of historic districts is a key challenge in China’s urban regeneration. Taking Shanghai’s Zhangyuan Historic District as a case, this study constructs a framework of six spatial indicators spanning spatial morphology, path accessibility, and thermal comfort: width-to-height ratio (W/H), interface transparency, connectivity, integration, Universal Thermal Climate Index (UTCI), and mean radiant temperature (MRT). Using Grey Relational Analysis, it quantitatively examines how these factors affect spatial vitality and pedestrian behavior. Findings indicate that, overall, W/H and connectivity are the primary drivers of vitality in plazas and alleys, while thermal comfort (MRT, UTCI) strongly affects stationary behaviors. By typology, plazas exhibit the strongest association with interface transparency (grey relational grade = 0.870), indicating that open sightlines and permeable interfaces promote pedestrian flow and staying. North–south alleys show pronounced associations with thermal comfort (MRT = 0.918; UTCI = 0.874), suggesting that microclimate-friendly environments can substantially enhance vitality in linear walking spaces. East–west alleys are dominated by connectivity (0.831) and W/H (0.849), whereas integration shows a low grade (0.512), revealing weaker configurational coherence for this spatial type. At the micro-scale, connectivity outperforms integration in predicting pedestrian route choices, reflecting actual movement preferences. The study highlights the combined effects of multidimensional built-environment factors and provides a scientific basis for targeted spatial optimization, sustainable renewal, and vitality-oriented design in historic urban areas.
(This article belongs to the Section Land Use, Impact Assessment and Sustainability)
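Grey relational grades of the kind reported above (0.870, 0.918, and so on) can be computed with Deng's classic formulation. A stdlib sketch, assuming all sequences are already normalized; the toy numbers are made up, not the Zhangyuan data:

```python
def grey_relational_grades(reference, series, rho=0.5):
    """Deng's grey relational grade of each comparison sequence against a
    reference sequence (rho is the distinguishing coefficient, usually 0.5)."""
    deltas = [[abs(r - x) for r, x in zip(reference, s)] for s in series]
    dmin = min(min(d) for d in deltas)   # global minimum deviation
    dmax = max(max(d) for d in deltas)   # global maximum deviation
    grades = []
    for d in deltas:
        # per-point relational coefficient, then average into a grade
        coeffs = [(dmin + rho * dmax) / (dk + rho * dmax) for dk in d]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Toy vitality sequence vs. two candidate built-environment indicators
vitality = [1.0, 1.0, 1.0]
grades = grey_relational_grades(vitality, [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0]])
```

A sequence identical to the reference scores 1.0 and grades fall toward 1/3 as deviation grows (with rho = 0.5), which is why grades such as 0.918 versus 0.512 can be read as a ranking of association strength.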

50 pages, 1057 KB  
Review
Formulation of Recombinant Therapeutic Proteins: Technological Innovation, Regulations, and Evolution Towards Buffer-Free Formulations
by Tomas Gabriel Bas
Pharmaceutics 2025, 17(9), 1183; https://doi.org/10.3390/pharmaceutics17091183 - 11 Sep 2025
Viewed by 874
Abstract
Background/Objectives: Formulating recombinant therapeutic proteins is essential to ensure their safety, efficacy, and stability. A growing trend in biopharmaceutical development is the move toward buffer-free formulations, which aim to reduce immunogenicity, improve tolerability, and simplify production. This review explores technological advances, regulatory perspectives, and safety considerations related to this shift. Methods: A systematic documentary review was conducted using the PSALSAR framework. Scientific publications, patents, and regulatory documents (2020–2025) were retrieved from PubMed, Scopus, Web of Science, and regulatory databases (FDA, EMA). Inclusion criteria focused on recombinant proteins, buffer-free formulations, and regulatory alignment. Results: The findings reveal increasing adoption of self-buffering strategies in high-concentration subcutaneous biologics. Technologies such as Fc-fusion, PASylation, and XTENylation enhance stability without conventional buffers. Regulatory bodies are progressively accepting minimalist formulations, provided safety and biosimilarity are demonstrated; however, intellectual-property barriers limit formulation transparency. A synthesis of recent FDA and EMA approvals illustrates this evolution. Conclusions: Buffer-free formulations offer a promising alternative for therapeutic protein development by improving patient experience and reducing formulation complexity. They align with biosimilar goals and regulatory trends, although long-term transparency and safety assessments remain critical for widespread adoption.
(This article belongs to the Special Issue Formulation of Recombinant Therapeutic Proteins)

39 pages, 4081 KB  
Review
Two Sides of the Same Coin for Health: Adaptogenic Botanicals as Nutraceuticals for Nutrition and Pharmaceuticals in Medicine
by Alexander Panossian and Terrence Lemerond
Pharmaceuticals 2025, 18(9), 1346; https://doi.org/10.3390/ph18091346 - 8 Sep 2025
Viewed by 464
Abstract
Background: Adaptogens, commonly used as traditional herbal medicinal products for the relief of stress symptoms such as fatigue and exhaustion, belong to a category of physiologically active compounds related to the physiological process of adaptability to stressors. They are used both as pharmaceuticals in medicine and as dietary supplements or nutraceuticals in nutrition, depending on the dose and on whether the indication is to treat disease or to support healthy function. This dual-faced nature, however, can lead to inconsistent and contradictory outcomes from food and drug regulatory authorities in various countries. Aims: This narrative literature review aimed to (i) specify five steps of pharmacological testing of adaptogens, (ii) identify sources of inconsistency in assessing the evidence for the safety, efficacy, and quality of multitarget adaptogenic botanicals, and (iii) propose potential solutions to food and drug regulatory issues, specifically for adaptogenic botanicals used in the prevention and treatment of diseases of complex etiology, including stress-induced and aging-related disorders. Overview: This critically oriented narrative review focuses on (i) the five steps of pharmacological testing of adaptogens required in sequential order, including appropriate in vivo animal and in vitro models, elucidation of mechanisms of action by suitable biochemical assays and molecular biology techniques combined with network pharmacology analysis, and clinical trials in stress-induced and aging-related disorders; (ii) the differences between the quality requirements for pharmaceuticals and for dietary supplements of botanical origin; (iii) progress, trends, pitfalls, and challenges in adaptogen research; (iv) the inadequate assignment of some plants to the adaptogens, or insufficient scientific data, as in the case of Eurycoma longifolia; and (v) inconsistencies in botanical risk assessment, as in the case of Withania somnifera. Conclusions: This narrative review highlights the importance of harmonized standards, transparent methodologies, and a balanced, evidence-informed approach to ensure that consumers receive effective and safe botanicals. Future perspectives and proposed solutions include (i) establishing internationally harmonized guidelines for evaluating botanicals based on their intended use (e.g., pharmaceutical vs. dietary supplement), incorporating traditional-use data alongside modern scientific methods; (ii) encouraging peer review and transparency in national assessments by mandating public disclosure of methodologies, data sources, and expert affiliations; (iii) creating a tiered evidence framework that allows differentiated standards of proof for traditional botanical supplements versus pharmaceutical candidates; (iv) promoting international scientific dialog among regulators, researchers, and industry to develop consensus positions and avoid unilateral bans that may lack scientific rigor; and (v) formally recognizing adaptogens as a category of natural products for the prevention of stress-induced brain fatigue and of behavioral and aging-related disorders.
(This article belongs to the Special Issue Network Pharmacology of Natural Products, 2nd Edition)

44 pages, 4535 KB  
Review
The Pacific Alliance Integration Process: A Systematic Literature Review
by Antonella Alexandra Canovas Roque, Juan Carlos Daniel De Vinatea Murguía, Alexander David Perez Chamochumbi, Ricardo Alonso Quimper Roncagliolo, Ángela Isamar Tapia Ostos, Jeremy Yermain Torres Jauregui and Julio Ricardo Moscoso Cuaresma
Economies 2025, 13(9), 255; https://doi.org/10.3390/economies13090255 - 2 Sep 2025
Viewed by 1029
Abstract
The Pacific Alliance has established itself as one of the most dynamic regional economic integration initiatives, standing out for its pragmatic, consensus-based approach to the liberalisation of trade, capital, and the movement of people. Between 2020 and 2025, however, the bloc faced both opportunities and challenges arising from the international situation, including global tensions, internal political crises, and the need for technological adaptation. Against this backdrop, this study conducts a systematic review of the scientific literature published between 2020 and 2025 on the Pacific Alliance, identifying the predominant theoretical approaches and the main findings on the integration process. The review followed the PRISMA methodology, which provided a structured, transparent, and rigorous framework for a deeper and more informed understanding of the integration process. The results show that, despite some progress, structural limitations persist, such as asymmetries between countries, institutional obstacles, superficial integration in economic and social matters, fragmentation in academic production, and limited incorporation of geopolitical perspectives. This study contributes to a critical understanding of the current state and future challenges of the Pacific Alliance, offering inputs for the formulation of public policies and for future research on Latin American integration.
(This article belongs to the Section International, Regional, and Transportation Economics)

26 pages, 1665 KB  
Review
A Review of XAI Methods Applications in Forecasting Runoff and Water Level Hydrological Tasks
by Andrei M. Bramm, Pavel V. Matrenin and Alexandra I. Khalyasmaa
Mathematics 2025, 13(17), 2830; https://doi.org/10.3390/math13172830 - 2 Sep 2025
Viewed by 696
Abstract
Modern artificial intelligence methods are increasingly applied in hydrology, particularly for forecasting water inflow into reservoirs, yet their limited interpretability constrains practical deployment in critical water-resource management systems. Explainable AI (XAI) offers solutions aimed at increasing model transparency, making the topic relevant to the development of sustainable and trusted AI systems in hydrology. Articles published in leading scientific journals in recent years were selected for this review; the selection criteria were the application of XAI methods to hydrological forecasting problems and the presence of a quantitative assessment of interpretability. The main attention is paid to approaches combining LSTM, GRU, CNN, and ensemble models with XAI methods such as SHAP, LIME, Grad-CAM, and ICE. The review shows that XAI mechanisms increase confidence in AI forecasts, identify important meteorological features, and allow analysis of parameter interactions. However, interpretation remains poorly standardized, especially for problems with high-dimensional input data. The review emphasizes the need to develop robust, unified XAI approaches that can be integrated into next-generation hydrological models.
(This article belongs to the Special Issue Machine Learning and Data Mining for Time Series and Model Adaptation)

18 pages, 492 KB  
Review
Consumer Psychology in Functional Beverages: From Nutritional Awareness to Habit Formation
by Tariq A. Alalwan
Beverages 2025, 11(5), 126; https://doi.org/10.3390/beverages11050126 - 1 Sep 2025
Viewed by 1085
Abstract
The functional beverage sector has experienced a remarkable transformation driven by evolving consumer decision-making patterns emphasizing therapeutic benefits alongside taste preferences. This comprehensive narrative review investigates how consumer psychology, neurobiological processes, and scientific product development converge through a hierarchical framework illustrating their dynamic interactions. Today’s consumers exhibit unprecedented sophistication when assessing bioactive ingredients, conducting independent research using scientific databases rather than relying on conventional marketing. Our analysis explores mechanisms underlying habit development, behavioral adaptation, and social proof factors driving functional beverage integration into daily routines. We trace the evolution from broad-spectrum wellness drinks toward personalized nutrition solutions that recognize individual metabolic requirements, with consumers viewing these products as preventive health investments requiring evidence-based validation. Key findings underscore the importance of clinically validated formulations at therapeutic dosages, nutritional transparency, and understanding consumer psychology for fostering lasting consumption behaviors driven by cost–benefit analysis. Results indicate future innovations must merge sophisticated bioactive delivery technologies with insights into consumer information-seeking patterns, social validation processes, and evidence-driven decision-making mechanisms.

25 pages, 4657 KB  
Article
Identifying Methodological Language in Psychology Abstracts: A Machine Learning Approach Using NLP and Embedding-Based Clustering
by Konstantinos G. Stathakis, George Papageorgiou and Christos Tjortjis
Big Data Cogn. Comput. 2025, 9(9), 224; https://doi.org/10.3390/bdcc9090224 - 29 Aug 2025
Viewed by 606
Abstract
Research articles are valuable resources for Information Retrieval and Natural Language Processing (NLP) tasks, offering opportunities to analyze key components of scholarly content. This study investigates the presence of methodological terminology in psychology research over the past 30 years (1995–2024) by applying a novel NLP and Machine Learning pipeline to a large corpus of 85,452 abstracts, as well as the extent to which this terminology forms distinct thematic groupings. Combining glossary-based extraction, contextualized language model embeddings, and dual-mode clustering, this study offers a scalable framework for the exploration of methodological transparency in scientific text via deep semantic structures. A curated glossary of 365 method-related keywords served as a gold-standard reference for term identification, using direct and fuzzy string matching. Retrieved terms were encoded with SciBERT, averaging embeddings across contextual occurrences to produce unified vectors. These vectors were clustered using unsupervised and weighted unsupervised approaches, yielding six and ten clusters, respectively. Cluster composition was analyzed using weighted statistical measures to assess term importance within and across groups. A total of 78.16% of the examined abstracts contained glossary terms, with an average of 1.8 terms per abstract, highlighting an increasing presence of methodological terminology in psychology and reflecting a shift toward greater transparency in research reporting. This work goes beyond the use of static vectors by incorporating contextual understanding in the examination of methodological terminology, while offering a scalable and generalizable approach to semantic analysis in scientific texts, with implications for meta-research, domain-specific lexicon development, and automated scientific knowledge discovery.
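A toy version of the glossary-based extraction step — direct substring matching followed by fuzzy matching over same-length word windows — might look like the following. The mini-glossary and the 0.9 similarity threshold are illustrative assumptions; the study's curated glossary contains 365 terms.

```python
import re
from difflib import SequenceMatcher

# Hypothetical mini-glossary; the actual study curates 365 method-related terms.
GLOSSARY = ["regression", "randomized controlled trial", "factor analysis", "anova"]

def extract_terms(abstract, glossary=GLOSSARY, fuzzy_threshold=0.9):
    """Find glossary terms in an abstract by direct, then fuzzy, string matching."""
    text = abstract.lower()
    found = set()
    for term in glossary:
        if term in text:  # direct substring match
            found.add(term)
            continue
        # Fuzzy match: slide a window with the term's word count over the text.
        words = re.findall(r"[a-z]+", text)
        k = len(term.split())
        for i in range(len(words) - k + 1):
            window = " ".join(words[i:i + k])
            if SequenceMatcher(None, term, window).ratio() >= fuzzy_threshold:
                found.add(term)  # tolerates misspellings such as "regresion"
                break
    return sorted(found)

hits = extract_terms("We ran an ANOVA and a logistic regresion on the survey data.")
```

The fuzzy pass catches the misspelled "regresion" that an exact match would miss; at scale, term frequencies from this step would feed the statistics reported in the study.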
(This article belongs to the Special Issue Machine Learning Applications in Natural Language Processing)
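The embedding-averaging and clustering stages of such a pipeline can be sketched with toy data: random vectors stand in for SciBERT outputs, and a plain two-cluster k-means with deterministic farthest-point initialization stands in for the study's dual-mode clustering. All term names and vector values below are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for SciBERT: each term has several contextual occurrence vectors;
# averaging them yields one unified vector per term.
occurrences = {
    "anova":       rng.normal(loc=0.0, size=(5, 8)),
    "t-test":      rng.normal(loc=0.0, size=(3, 8)),
    "interview":   rng.normal(loc=4.0, size=(4, 8)),
    "focus group": rng.normal(loc=4.0, size=(6, 8)),
}
terms = list(occurrences)
V = np.stack([occ.mean(axis=0) for occ in occurrences.values()])

def kmeans2(V, iters=20):
    """Two-cluster k-means with deterministic farthest-point initialization."""
    second = V[np.argmax(np.linalg.norm(V - V[0], axis=1))]
    centroids = np.stack([V[0], second])
    for _ in range(iters):
        d = np.linalg.norm(V[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(2):
            if np.any(labels == j):
                centroids[j] = V[labels == j].mean(axis=0)
    return labels

labels = kmeans2(V)
clusters = {t: int(l) for t, l in zip(terms, labels)}
```

On this toy data the quantitative terms (`anova`, `t-test`) land in one cluster and the qualitative terms in the other; one plausible reading of the study's weighted variant would additionally scale each term vector by its corpus frequency before clustering.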
