Search Results (415)

Search Parameters:
Keywords = true progression

14 pages, 3995 KiB  
Article
Future Illiteracies—Architectural Epistemology and Artificial Intelligence
by Mustapha El Moussaoui
Architecture 2025, 5(3), 53; https://doi.org/10.3390/architecture5030053 - 25 Jul 2025
Viewed by 267
Abstract
In the age of artificial intelligence (AI), architectural practice faces a paradox of immense potential and creeping standardization. As humans increasingly rely on AI-generated outputs, architecture risks becoming a spectacle of repetition—a shuffling of data that neither truly innovates nor progresses vertically in creative depth. This paper explores the critical role of data in AI systems, scrutinizing the training datasets that form the basis of AI’s generative capabilities and the implications for architectural practice. We argue that when architects approach AI without actively engaging their own creative and critical faculties, they risk becoming passive users locked in an endless loop of horizontal expansion without meaningful vertical growth. By examining the epistemology of architecture in the AI age, this paper calls for a paradigm in which AI serves as a tool for vertical and horizontal growth, contingent on human creativity and agency. Only by mastering this dynamic relationship can architects avoid the trap of passive, standardized design and unlock the true potential of AI. Full article
(This article belongs to the Special Issue AI as a Tool for Architectural Design and Urban Planning)

20 pages, 6563 KiB  
Article
Determining the Structural Characteristics of Farmland Shelterbelts in a Desert Oasis Using LiDAR
by Xiaoxiao Jia, Huijie Xiao, Zhiming Xin, Junran Li and Guangpeng Fan
Forests 2025, 16(8), 1221; https://doi.org/10.3390/f16081221 - 24 Jul 2025
Viewed by 139
Abstract
The structural analysis of shelterbelts forms the foundation of their planning and management, yet the scientific and effective quantification of shelterbelt structures requires further investigation. This study developed an innovative heterogeneous analytical framework, integrating three key methodologies: the LeWoS algorithm for wood–leaf separation, TreeQSM for structural reconstruction, and 3D alpha-shape spatial quantification, using terrestrial laser scanning (TLS) technology. This framework was applied to three typical farmland shelterbelts in the Ulan Buh Desert oasis, enabling the first precise quantitative characterization of structural components during the leaf-on stage. The results showed the following: (1) The combined three-algorithm method achieved ≥90.774% relative accuracy in extracting structural parameters for all measured traits except leaf surface area. (2) Branch length, diameter, surface area, and volume decreased progressively from first- to fourth-order branches, while branch angles increased with ascending branch order. (3) The trunk, branch, and leaf components exhibited distinct vertical stratification. Trunk volume and surface area decreased linearly with height, while branch and leaf volumes and surface areas followed an inverted U-shaped distribution. (4) Horizontally, both surface area density (Scd) and volume density (Vcd) in each cube unit exhibited pronounced edge effects. Specifically, the Scd and Vcd were greatest between 0.33 and 0.60 times the shelterbelt’s height (H, i.e., mid-canopy). In contrast, the optical porosity (Op) reached its minimum at 0.43 H to 0.67 H, while the volumetric porosity (Vp) reached its minimum at 0.25 H to 0.50 H. (5) The proposed volumetric stratified porosity (Vsp) metric provides a scientific basis for regional farmland shelterbelt management strategies. This three-dimensional structural analytical framework enables precision silviculture, with particular relevance to strengthening ecological barrier efficacy in arid regions. Full article
(This article belongs to the Section Forest Inventory, Modeling and Remote Sensing)
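
As a quick illustration of the accuracy figure quoted in this abstract, the sketch below computes one common definition of relative accuracy for a TLS-derived structural parameter against a field-measured reference. The formula and the example values are assumptions made for illustration; the paper's exact definition is not given in the abstract.

```python
# Illustrative only: one common definition of "relative accuracy" for a
# TLS-derived structural parameter compared against a field reference.

def relative_accuracy(estimated: float, reference: float) -> float:
    """Relative accuracy in percent: 100 * (1 - |est - ref| / ref)."""
    return 100.0 * (1.0 - abs(estimated - reference) / reference)

# Example (hypothetical values): a QSM-derived branch volume of 0.095 m^3
# against a reference measurement of 0.10 m^3.
print(relative_accuracy(0.095, 0.10))  # 95.0
```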

19 pages, 507 KiB  
Review
Radiomics and Radiogenomics in Differentiating Progression, Pseudoprogression, and Radiation Necrosis in Gliomas
by Sohil Reddy, Tyler Lung, Shashank Muniyappa, Christine Hadley, Benjamin Templeton, Joel Fritz, Daniel Boulter, Keshav Shah, Raj Singh, Simeng Zhu, Jennifer K. Matsui and Joshua D. Palmer
Biomedicines 2025, 13(7), 1778; https://doi.org/10.3390/biomedicines13071778 - 21 Jul 2025
Viewed by 341
Abstract
Over recent decades, significant advancements have been made in the treatment and imaging of gliomas. Conventional imaging techniques, such as MRI and CT, play critical roles in glioma diagnosis and treatment but often fail to distinguish between tumor pseudoprogression (Psp) and radiation necrosis (RN) versus true progression (TP). Emerging fields like radiomics and radiogenomics are addressing these challenges by extracting quantitative features from medical images and correlating them with genomic data, respectively. This article will discuss several studies that show how radiomic features (RFs) can aid in better patient stratification and prognosis. Radiogenomics, particularly in predicting biomarkers such as MGMT promoter methylation and 1p/19q codeletion, shows potential in non-invasive diagnostics. Radiomics also offers tools for predicting tumor recurrence (rBT), essential for treatment management. Further research is needed to standardize these methods and integrate them into clinical practice. This review underscores radiomics and radiogenomics’ potential to revolutionize glioma management, marking a significant shift towards precision neuro-oncology. Full article
(This article belongs to the Special Issue Mechanisms and Novel Therapeutic Approaches for Gliomas)

18 pages, 276 KiB  
Article
The Soul at Prayer
by Richard G. T. Gipps
Religions 2025, 16(7), 928; https://doi.org/10.3390/rel16070928 - 18 Jul 2025
Viewed by 271
Abstract
Wittgenstein lists prayer as a distinct language-game, but leaves to others the investigation of its character. Formulating it as “conversation with God” is correct but potentially unhelpful, in part because it presupposes that we can understand what God is independently of knowing what it is to pray. But by situating the language-game in the context of our human form of life we make better progress. The discussion of this paper, the focus of which is Christian prayer, first reminds us of what it is to have a soul life—i.e., a life in which hope, conscience, and vitality are interpenetrating elements. It next sketches a more distinctly Christian anthropology in which our lives are understood as marred by pride, lack of trust and openness, and ingratitude. Against this backdrop, prayer can be understood for what it is as the soul coming out of its proud retreat, speaking in its own voice, owning its distortions, acknowledging its gratitude, and pleading its true desires. And God can be understood as (inter alia) that to which prayer is principally offered. Full article
(This article belongs to the Special Issue New Work on Wittgenstein's Philosophy of Religion)
14 pages, 486 KiB  
Review
Bisphenol A Promotes the Progression of Hormone-Sensitive Breast Cancers Through Several Inflammatory Pathways
by Michael Thoene, Kamila Zglejc-Waszak, Marcin Jozwik and Joanna Wojtkiewicz
Cancers 2025, 17(14), 2373; https://doi.org/10.3390/cancers17142373 - 17 Jul 2025
Viewed by 421
Abstract
Background/Objectives: Bisphenol A (BPA) is found throughout the environment and exposure to it has been shown to cause several health problems, including cancer. The problem with BPA is that it is a xenoestrogen that is chemically very similar to 17β-estradiol. Chronic exposure to BPA overstimulates the estrogen receptors and leads to inflammation that triggers several pathways leading to cancer progression. This is especially true in the case of hormone-sensitive breast cancers. This article reviewed the main pathways thought to be involved in the formation and/or progression of the most common forms of hormone-sensitive breast cancers due to BPA exposure. The main results were compiled and presented in tables along with a more detailed discussion of each pathway within the text. In most cases, chronic BPA exposure led to inflammation, which then triggered pathways leading to cancer stem cell formation and maintenance. In other cases, BPA exposure led to the formation of reactive oxygen species that damaged DNA and caused the formation of mutated p53 and tumorigenesis. Conclusions: The article summarizes the key pathways that are currently known, pertaining to how BPA leads to the progression and maintenance of breast cancer. The article then concludes by discussing how prenatal and perinatal BPA exposure may also predispose women to hormone-sensitive breast cancers later in life. Full article

26 pages, 2596 KiB  
Article
DFPoLD: A Hard Disk Failure Prediction on Low-Quality Datasets
by Shuting Wei, Xiaoyu Lu, Hongzhang Yang, Chenfeng Tu, Jiangpu Guo, Hailong Sun and Yu Feng
Informatics 2025, 12(3), 73; https://doi.org/10.3390/informatics12030073 - 16 Jul 2025
Viewed by 279
Abstract
Hard disk failure prediction is an important proactive maintenance method for storage systems. Recent years have seen significant progress in hard disk failure prediction using high-quality SMART datasets. However, in industrial applications, data loss often occurs during SMART data collection, transmission, and storage. Existing machine learning-based hard disk failure prediction models perform poorly on low-quality datasets. Therefore, this paper proposes a hard disk fault prediction technique based on low-quality datasets. Firstly, based on the original Backblaze dataset, we construct a low-quality dataset, Backblaze-, by simulating sector damage in actual scenarios and deleting 10% to 99% of the data. Time series features like the Absolute Sum of First Difference (ASFD) were introduced to amplify the differences between positive and negative samples and reduce the sensitivity of the model to SMART data loss. Considering the impact of different quality datasets on time window selection, we propose a time window selection formula that selects different time windows based on the proportion of data loss. It is found that the poorer the dataset quality, the longer the time window selection should be. The proposed model achieves a True Positive Rate (TPR) of 99.46%, AUC of 0.9971, and F1 score of 0.9871, with a False Positive Rate (FPR) under 0.04%, even with 80% data loss, maintaining performance close to that on the original dataset. Full article
(This article belongs to the Section Big Data Mining and Analytics)
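
For readers unfamiliar with the ASFD feature named in this abstract, here is a minimal sketch of an Absolute Sum of First Difference computed over a windowed SMART attribute series. The window length, the attribute, and the gap-filling strategy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def asfd(series: np.ndarray) -> float:
    """Absolute Sum of First Difference: sum of |x[t+1] - x[t]| over the window."""
    return float(np.abs(np.diff(series)).sum())

# Example: a 14-day window of a hypothetical reallocated-sector count, with
# gaps from lost samples forward-filled with the last known value.
window = np.array([0, 0, 0, 2, 2, 2, 5, 5, 9, 9, 9, 14, 14, 14], dtype=float)
print(asfd(window))  # 14.0: large jumps between samples raise the feature value
```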

22 pages, 2261 KiB  
Article
Learning Deceptive Strategies in Adversarial Settings: A Two-Player Game with Asymmetric Information
by Sai Krishna Reddy Mareddy and Dipankar Maity
Appl. Sci. 2025, 15(14), 7805; https://doi.org/10.3390/app15147805 - 11 Jul 2025
Viewed by 350
Abstract
This study explores strategic deception and counter-deception in multi-agent reinforcement learning environments for a police officer–robber game. The research is motivated by real-world scenarios where agents must operate with partial observability and adversarial intent. We develop a suite of progressively complex grid-based environments featuring dynamic goals, fake targets, and navigational obstacles. Agents are trained using deep Q-networks (DQNs) with game-theoretic reward shaping to encourage deceptive behavior in the robber and intent inference in the police officer. The robber learns to reach the true goal while misleading the police officer, and the police officer adapts to infer the robber’s intent and allocate resources effectively. The environments include fixed and dynamic layouts with varying numbers of goals and obstacles, allowing us to evaluate scalability and generalization. Experimental results demonstrate that the agents converge to equilibrium-like behaviors across all settings. The inclusion of obstacles increases complexity but also strengthens learned policies when guided by reward shaping. We conclude that integrating game theory with deep reinforcement learning enables the emergence of robust, deceptive strategies and effective counter-strategies, even in dynamic, high-dimensional environments. This work advances the design of intelligent agents capable of strategic reasoning under uncertainty and adversarial conditions. Full article
(This article belongs to the Special Issue Research Progress on the Application of Multi-agent Systems)
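
The abstract describes game-theoretic reward shaping that encourages deceptive behavior in the robber. The sketch below shows one hypothetical way such shaping could look, rewarding progress toward the true goal while also rewarding moves that stay consistent with a path to the decoy; the weights, the Manhattan distance metric, and the bonus value are assumptions for illustration, not the paper's reward function.

```python
# Hypothetical deceptive reward shaping for the robber on a grid world.

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def robber_reward(prev_pos, new_pos, true_goal, fake_goal,
                  w_progress=1.0, w_deception=0.5, goal_bonus=10.0):
    """Reward progress toward the true goal while also rewarding moves that
    keep the agent plausibly headed for the fake (decoy) goal."""
    progress = manhattan(prev_pos, true_goal) - manhattan(new_pos, true_goal)
    # Deception term: positive when the move also shortens the apparent path
    # to the decoy, making the robber's intent harder to infer.
    decoy_progress = manhattan(prev_pos, fake_goal) - manhattan(new_pos, fake_goal)
    reward = w_progress * progress + w_deception * max(decoy_progress, 0.0)
    if new_pos == true_goal:
        reward += goal_bonus
    return reward

# Example step: moving from (2, 2) to (2, 3) with the true goal at (0, 5) and
# the decoy at (4, 5) advances on both, so both terms contribute.
print(robber_reward((2, 2), (2, 3), (0, 5), (4, 5)))  # 1.5
```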

21 pages, 964 KiB  
Article
Innovation in Timber Processing—A Case Study on Low-Grade Resource Utilisation for High-Grade Timber Products
by Sebastian Klein, Benoit Belleville, Giorgio Marfella, Rodney Keenan and Robert L. McGavin
Forests 2025, 16(7), 1127; https://doi.org/10.3390/f16071127 - 8 Jul 2025
Viewed by 332
Abstract
Native forest timber supplies are declining, and industry needs to do more with less to meet growing demand for wood products. An Australian-based, vertically integrated timber manufacturing business is commissioning a spindleless lathe to produce engineered wood products from small logs. The literature on innovation in timber manufacturing was found to generally focus on technical innovation, with relatively little use of market-oriented concepts and theory. This was particularly true in the Australian context. Using a market-oriented case study approach, this research assessed innovation in the business. It aimed to inform industry-wide innovation approaches to meet market demand in the face of timber supply challenges. Interviews were conducted with key personnel at the firm. Data and outputs were produced to facilitate comparison to existing research and conceptual frameworks. The business was found to empower key staff and willingly access knowledge, information and data from outside its corporate domain. It was also found to prioritise corporate goals outside of traditional goals of profit and competitive advantage. This was shown to increase willingness to try new things at the mill and increase the chances that new approaches would succeed. Thinking outside of the corporate domain was shown to allow access to resources that the firm could not otherwise count on. It is recommended that wood processing businesses seek to emulate this element of the case study, and that academia and the broader sector examine further the potential benefits of using enterprise and market-oriented lenses to better utilise available resources and maintain progress towards corporate goals. Full article

23 pages, 863 KiB  
Article
GLR: Graph Chain-of-Thought with LoRA Fine-Tuning and Confidence Ranking for Knowledge Graph Completion
by Yifei Chen, Xuliang Duan and Yan Guo
Appl. Sci. 2025, 15(13), 7282; https://doi.org/10.3390/app15137282 - 27 Jun 2025
Viewed by 642
Abstract
In knowledge graph construction, missing facts often lead to incomplete structures, thereby limiting the performance of downstream applications. Although recent knowledge graph completion (KGC) methods based on representation learning have achieved notable progress, they still suffer from two fundamental limitations, namely the lack of structured reasoning capabilities and the inability to assess the confidence of their predictions, which often results in unreliable outputs. We propose the GLR framework, which integrates Graph Chain-of-Thought (Graph-CoT) reasoning, LoRA fine-tuning, and the P(True)-based confidence evaluation mechanism. In the KGC task, this approach effectively enhances the reasoning ability and prediction reliability of large language models (LLMs). Specifically, Graph-CoT introduces local subgraph structures to guide LLMs in performing graph-constrained, step-wise reasoning, improving their ability to model multi-hop relational patterns. Complementing this, LoRA-based fine-tuning enables efficient adaptation of LLMs to the KGC scenario with minimal computational overhead, further enhancing the model’s capability for graph-structured reasoning. Moreover, the P(True) mechanism quantifies the reliability of candidate entities, improving the robustness of ranking and the controllability of outputs, thereby enhancing the credibility and interpretability of model predictions in knowledge reasoning tasks. We conducted systematic experiments on the standard KGC datasets FB15K-237, WN18RR, and UMLS, which demonstrate the effectiveness and robustness of the GLR framework. Notably, GLR achieves a Mean Reciprocal Rank (MRR) of 0.507 on FB15K-237, marking a 6.8% improvement over the best recent instruction-tuned method, DIFT combined with CoLE (MRR = 0.439). GLR also maintains significant performance advantages on WN18RR and UMLS, verifying its effectiveness in enhancing both the structured reasoning capabilities and the prediction reliability of LLMs for KGC tasks. These results indicate that GLR offers a unified and scalable solution to enhance structure-aware reasoning and output reliability of LLMs in KGC. Full article
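
Since the reported gains are expressed as Mean Reciprocal Rank, a minimal sketch of the metric may help; the example ranks below are placeholders, and only the metric definition itself is standard.

```python
# Mean Reciprocal Rank over a set of link-prediction queries.

def mean_reciprocal_rank(ranks):
    """MRR = average of 1/rank of the gold entity over all test triples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Example: ranks assigned to the gold entity for four (head, relation, ?) queries.
print(mean_reciprocal_rank([1, 2, 4, 10]))  # 0.4625
```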

12 pages, 679 KiB  
Article
A Novel Echocardiographic Index (Modified-Left-Atrium-to-Aorta Ratio) for Quantifying Left Atrial Size and Differentiating Stages in Dogs with Myxomatous Mitral Valve Disease
by Minsuk Kim, Minwoong Seo and Chul Park
Animals 2025, 15(12), 1820; https://doi.org/10.3390/ani15121820 - 19 Jun 2025
Viewed by 448
Abstract
Myxomatous mitral valve disease (MMVD) is the most common heart disease in small-breed dogs, and accurate assessment of left atrial (LA) size is essential for diagnosis and management. The traditional echocardiographic method, the LA-to-Aorta (LA/Ao) ratio, is widely used but evaluates LA size in only one view. This study introduces a novel index—the modified-LA/Ao (M-LA/Ao) ratio—which combines two echocardiographic views to better reflect the true LA size. This study retrospectively analyzed thoracic radiographs and echocardiograms from 136 dogs, including healthy controls and dogs with MMVD classified into stages B1, B2, and C according to ACVIM guidelines. The performance of the M-LA/Ao ratio was compared with that of conventional indices using correlation analysis and receiver operating characteristic (ROC) curves. The M-LA/Ao ratio showed a strong correlation with existing indices and slightly improved discrimination between ACVIM stages B1 and B2, although performance between stages B2 and C was similar to that of the LA/Ao ratio. Intra- and interobserver variability were also acceptable. Our findings suggest that the M-LA/Ao ratio may provide a practical and sensitive method to evaluate LA enlargement in early-stage MMVD, helping clinicians detect subtle cardiac remodeling before progression to advanced disease. Full article
(This article belongs to the Section Veterinary Clinical Studies)

15 pages, 3542 KiB  
Article
Longitudinal Overlap and Metabolite Analysis in Spectroscopic MRI-Guided Proton Beam Therapy in Pediatric High-Grade Glioma
by Abinand C. Rejimon, Anuradha G. Trivedi, Vicki Huang, Karthik K. Ramesh, Natia Esiashvilli, Eduard Schreibmann, Hyunsuk Shim, Kartik Reddy and Bree R. Eaton
Tomography 2025, 11(6), 71; https://doi.org/10.3390/tomography11060071 - 19 Jun 2025
Viewed by 433
Abstract
Background: Pediatric high-grade glioma (pHGG) is a highly aggressive cancer with unique biology distinct from adult high-grade glioma, limiting the effectiveness of standard treatment protocols derived from adult research. Objective: The purpose of this report is to present preliminary results from an ongoing pilot study integrating spectroscopic magnetic resonance imaging (sMRI) to guide proton beam therapy (PBT) and longitudinal imaging analysis in pediatric patients with high-grade glioma. Methods: Thirteen pediatric patients under 21 years old with supratentorial WHO grade III-IV glioma underwent baseline and serial whole-brain spectroscopic MRI alongside standard structural MRIs. Radiation targets were defined using T1-weighted contrast-enhanced, T2-FLAIR, and Cho/NAA ≥ 2X maps. Longitudinal analyses included voxel-level metabolic change maps and spatial overlap metrics comparing pre- and post-proton therapy imaging. Results: Six patients had sufficient longitudinal data; five received sMRI-guided PBT. A significant positive correlation (R² = 0.89, p < 0.0001) was observed between T2-FLAIR and Cho/NAA ≥ 2X volumes. Voxel-level difference maps of Cho/NAA and choline revealed dynamic metabolic changes across follow-up scans. Analyzing Cho/NAA and Cho changes over time allowed differentiation between true progression and pseudoprogression, which conventional MRI alone struggles to achieve. Conclusions: Longitudinal sMRI enhances metabolic tracking in pHGG, detects early tumor changes, and refines RT targeting beyond structural imaging. This first-in-kind study highlights the potential of sMRI biomarkers in tracking treatment effects and emphasizes the complementary roles of metabolic and radiographic metrics in evaluating therapy response in pHGG. Full article
(This article belongs to the Section Cancer Imaging)
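
The radiation targets above are partly defined by Cho/NAA ≥ 2X maps, i.e., voxels whose choline-to-NAA ratio is at least twice a normal reference value. The sketch below shows that thresholding step on placeholder arrays; the array shapes, the normal-reference ratio, and the epsilon guard are assumptions, not the study's processing pipeline.

```python
import numpy as np

def cho_naa_target_mask(cho, naa, normal_ratio, factor=2.0, eps=1e-6):
    """Boolean mask of voxels with Cho/NAA >= factor * normal_ratio."""
    ratio = cho / np.maximum(naa, eps)   # guard against division by zero
    return ratio >= factor * normal_ratio

# Placeholder metabolite maps standing in for whole-brain sMRI volumes.
cho = np.random.rand(64, 64, 32) * 2.0
naa = np.random.rand(64, 64, 32) * 2.0 + 0.5
mask = cho_naa_target_mask(cho, naa, normal_ratio=0.6)
print(int(mask.sum()), "voxels flagged as Cho/NAA >= 2x the normal reference")
```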

28 pages, 11302 KiB  
Article
Mechanical Response and Failure Mechanisms of Block Caving Bottom Structures Under Dynamic Conditions Induced by Slope Rockfalls
by Xinglong Feng, Guangquan Li, Zeyue Wang, Xiongpeng Zhu, Zhenggao Huang and Hang Lin
Appl. Sci. 2025, 15(12), 6867; https://doi.org/10.3390/app15126867 - 18 Jun 2025
Viewed by 291
Abstract
The stability of bottom structures in block caving mines is significantly challenged by impact loads generated from large rockfalls and ore collapses on slopes. This study aims to investigate the mechanical response and failure characteristics of bottom structures under such dynamic and cyclic loading conditions. Discrete element methods (DEMs) were employed to simulate the impact load amplitudes caused by large rockfalls on bottom structures. Specimens with identical mechanical properties to the bottom structure were fabricated at a 1:100 scale, based on the principle of similarity ratio tests. Three distinct types of impact loads were identified and analyzed: overall impact from large-scale slope collapses, localized impact from partial rock and soil mass collapses, and continuous multiple impacts from progressive slope failures. True triaxial tests were conducted to evaluate the mechanical response of the bottom structure under these loading scenarios. The results indicate that while overall and multiple impact loads from slope collapses do not lead to catastrophic failure of the bottom structure, severe damage occurs under a 100 m thickness of ore and large block impacts. Specifically, the inner walls of ore accumulation troughs peel off, and ore pillars between troughs fracture and fail. This study highlights the need for advanced experimental and numerical approaches to accurately predict the stability and failure modes of bottom structures under complex loading conditions. Full article
(This article belongs to the Special Issue Advances and Techniques in Rock Fracture Mechanics)

32 pages, 1220 KiB  
Article
Income and Subjective Well-Being: The Importance of Index Choice for Sustainable Economic Development
by Tetsuya Tsurumi and Shunsuke Managi
Sustainability 2025, 17(12), 5266; https://doi.org/10.3390/su17125266 - 6 Jun 2025
Viewed by 667
Abstract
The relationship between income and subjective well-being (SWB) has been widely studied. While previous research has shown that the correlation between income and SWB is not always strong, there is limited research examining how the choice of SWB index influences this relationship. Drawing on survey data collected from 32 countries between 2015 and 2017, this study explores how the income–SWB relationship varies across different SWB indices. The dataset encompasses both developed and developing nations. We analyzed six types of SWB indices documented in the literature—covering a broader range than is typically included—and conducted comparative analyses. To account for the possibility of a nonlinear relationship between income and these SWB measures, we used a semiparametric approach by applying generalized additive models. Our findings show that these six indices can be categorized into three groups: (1) mental health and affect balance, (2) subjective happiness and eudaimonia, and (3) life satisfaction and the Cantril Ladder. These results underscore the significant impact that the selected SWB index can have on the income–SWB relationship. While economic development is often assumed to enhance SWB, our analysis reveals that this relationship does not hold consistently across all SWB indicators. In particular, certain indicators show little or no improvement in well-being despite increasing income levels, suggesting the presence of excessive or inefficient consumption that fails to contribute to genuine human flourishing. These findings challenge the conventional growth-centric paradigm and call for a deeper societal and academic inquiry into what constitutes “true prosperity.” From a sustainability perspective, aligning economic progress with authentic improvements in well-being is essential. This requires not only more careful selection and interpretation of SWB metrics, but also a broader re-evaluation of consumption patterns and policy goals to ensure that future development contributes meaningfully to human and ecological well-being. Full article
(This article belongs to the Section Health, Well-Being and Sustainability)
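
The semiparametric approach mentioned above (generalized additive models) can be sketched as follows on synthetic data; pygam is used here only as one possible library, and the simulated income-SWB relationship is an assumption for illustration, not the study's dataset or code.

```python
import numpy as np
from pygam import LinearGAM, s  # one option; the paper does not name its software

rng = np.random.default_rng(0)
income = rng.uniform(5_000, 150_000, size=500)
# Simulated life-satisfaction scores that flatten at higher incomes.
swb = 4.0 + 1.2 * np.log(income / 5_000) + rng.normal(0.0, 0.6, size=500)

X = income.reshape(-1, 1)
gam = LinearGAM(s(0)).fit(X, swb)        # smooth (spline) term on income
grid = np.linspace(5_000, 150_000, 100).reshape(-1, 1)
pred = gam.predict(grid)                 # estimated income-SWB curve
print(pred[:3], pred[-3:])               # rises steeply at low income, then flattens
```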

25 pages, 2176 KiB  
Review
AI-Driven Chemical Design: Transforming the Sustainability of the Pharmaceutical Industry
by Antonio Ruiz-Gonzalez
Future Pharmacol. 2025, 5(2), 24; https://doi.org/10.3390/futurepharmacol5020024 - 29 May 2025
Viewed by 1263
Abstract
The pharmaceutical industry faces mounting pressure to reduce its environmental impact while maintaining innovation in drug development. Artificial intelligence (AI) has emerged as a transformative tool across healthcare and drug discovery, yet its potential to drive sustainability by improving molecular design remains underexplored. This review critically examines the applications of AI in molecular design that can support in advancing greener pharmaceutical practices across the entire drug life cycle—from design and synthesis to waste management and solvent optimisation. We explore how AI-driven models are being used to personalise dosing, reduce pharmaceutical waste, and design biodegradable drugs with enhanced environmental compatibility. Significant advances have also been made in the predictive modelling of pharmacokinetics, drug–polymer interactions, and polymer biodegradability. AI’s role in the synthesis of active pharmaceutical compounds, including catalysts, enzymes, solvents, and synthesis pathways, is also examined. We highlight recent breakthroughs in protein engineering, biocatalyst stability, and heterogeneous catalyst screening using generative and language models. This review also explores opportunities and limitations in the field. Despite progress, several limitations constrain impact. Many AI models are trained on small or inconsistent datasets or rely on computationally intensive inputs that limit scalability. Moreover, a lack of standardised performance metrics and life cycle assessments prevents the robust evaluation of AI’s true environmental benefits. In particular, the environmental impact of AI-driven molecules and synthesis pathways remains poorly quantified due to limited data on emissions, waste, and energy usage at the compound level. Finally, a summary of challenges and future directions in the field is provided. Full article
(This article belongs to the Special Issue Feature Papers in Future Pharmacology 2025)

18 pages, 670 KiB  
Review
Targeting Obesity in Cardiovascular Disease Management: Cardiac Adipose Tissue Is a Real Biomarker!
by Saverio D’Elia, Ettore Luisi, Achille Solimene, Chiara Serpico, Mariarosaria Morello, Gisella Titolo, Valentina Maria Caso, Francesco S. Loffredo, Paolo Golino, Giovanni Cimmino and Francesco Natale
Targets 2025, 3(2), 17; https://doi.org/10.3390/targets3020017 - 23 May 2025
Viewed by 494
Abstract
Background: Obesity has been defined as a true worldwide “pandemic” by the World Health Organization and represents one of the major public health problems. It is associated with a reduction in life expectancy of about 7–8 years due to related cardiovascular diseases such as arterial hypertension, metabolic syndrome, insulin resistance, type 2 diabetes mellitus, and dyslipidemia. Adipose tissue is not merely a fat storage site but a true endocrine and immunologically active organ that secretes hormones and mediators (adipokines), influencing cardiovascular risk and host physiology. Objective: This review summarizes the current understanding of the role of epicardial adipose tissue (EAT) in cardiovascular disease pathophysiology and discusses its clinical diagnostic and therapeutic implications. Methods: A narrative non-systematic review was conducted focusing on recent literature concerning the biological and clinical aspects of cardiac adipose tissue, with particular emphasis on epicardial adipose tissue. The review examined its gene expression profile, secretory function, and interaction with cardiovascular structures and diseases. Findings: There are different types of adipose tissue, including cardiac adipose tissue, which comprises epicardial and pericardial (or paracardiac) fractions. Epicardial adipose tissue is unique due to its proximity to the heart and a distinct gene expression profile compared to other adipose depots such as visceral and subcutaneous fat. EAT plays a crucial role in the development and progression of cardiovascular diseases with high morbidity and mortality, acting both as a metabolic and inflammatory mediator. Conclusion: Cardiac adipose tissue, particularly EAT, is a key player in cardiometabolic disease. Understanding its pathophysiological role and incorporating imaging tools to evaluate EAT may enhance cardiovascular risk stratification and disease management. Full article