Search Results (693)

Search Parameters:
Keywords = fundamental laws

22 pages, 338 KiB  
Article
Configuration of Subjectivities and the Application of Neoliberal Economic Policies in Medellin, Colombia
by Juan David Villa-Gómez, Juan F. Mejia-Giraldo, Mariana Gutiérrez-Peña and Alexandra Novozhenina
Soc. Sci. 2025, 14(8), 482; https://doi.org/10.3390/socsci14080482 - 5 Aug 2025
Abstract
(1) Background: This article aims to understand the forms and elements through which the inhabitants of the city of Medellín have configured their subjectivity in the context of the application of neoliberal policies over the last two decades. In this way, we can approach the frameworks of understanding that constitute a fundamental part of the individuation processes through which these subjectivities have been incorporated in neoliberal contexts that, historically, have been converging with authoritarian, antidemocratic, and neoconservative elements. (2) Method: A qualitative approach with a hermeneutic-interpretative paradigm was used. In-depth semi-structured interviews were conducted with 41 inhabitants of Medellín who politically identified with right-wing or center-right positions. Data analysis included thematic coding to identify patterns of thought and points of view. (3) Results: Participants associate success with individual effort and see state intervention as an obstacle to development. They reject redistributive policies, arguing that these generate dependency. In addition, they justify authoritarian models of government in the name of security and progress, from a position of moral superiority that is related to a negative and stigmatizing perception of progressive sectors and a negative view of the social rule of law and socially oriented public policies. (4) Conclusions: The naturalization of merit as a guiding principle; the participants' perception of themselves as morally superior, based on religious values that grant a subjective place of certainty and goodness; the criminalization of left-wing political expression, mobilizations, and redistributive reforms; and support for policies that entrench authoritarianism and perpetuate exclusion and structural inequalities together close off avenues toward a participatory democracy that would enable social and economic transformations. Full article
30 pages, 1511 KiB  
Review
Environmental and Health Impacts of Pesticides and Nanotechnology as an Alternative in Agriculture
by Jesús Martín Muñoz-Bautista, Ariadna Thalía Bernal-Mercado, Oliviert Martínez-Cruz, Armando Burgos-Hernández, Alonso Alexis López-Zavala, Saul Ruiz-Cruz, José de Jesús Ornelas-Paz, Jesús Borboa-Flores, José Rogelio Ramos-Enríquez and Carmen Lizette Del-Toro-Sánchez
Agronomy 2025, 15(8), 1878; https://doi.org/10.3390/agronomy15081878 - 3 Aug 2025
Viewed by 50
Abstract
The extensive use of conventional pesticides has been a fundamental strategy in modern agriculture for controlling pests and increasing crop productivity; however, their improper application poses significant risks to human health and environmental sustainability. This review compiles scientific evidence linking pesticide exposure to oxidative stress and genotoxic damage, particularly affecting rural populations and commonly consumed foods, even at levels exceeding the maximum permissible limits in fruits, vegetables, and animal products. Additionally, excessive pesticide use has been shown to alter soil microbiota, negatively compromising long-term agricultural fertility. In response to these challenges, recent advances in nanotechnology offer promising alternatives. This review highlights the development of nanopesticides designed for controlled release, improved stability, and targeted delivery of active ingredients, thereby reducing environmental contamination and increasing efficacy. Moreover, emerging nanobiosensor technologies, such as e-nose and e-tongue systems, have shown potential for real-time monitoring of pesticide residues and soil health. Although pesticides are still necessary, it is crucial to implement stricter laws and promote sustainable solutions that ensure safe and responsible agricultural practices. The need for evidence-based public policy is emphasized to regulate pesticide use and protect both human health and agricultural resources. Full article
54 pages, 506 KiB  
Article
Enhancing Complex Decision-Making Under Uncertainty: Theory and Applications of q-Rung Neutrosophic Fuzzy Sets
by Omniyyah Saad Alqurashi and Kholood Mohammad Alsager
Symmetry 2025, 17(8), 1224; https://doi.org/10.3390/sym17081224 - 3 Aug 2025
Viewed by 116
Abstract
This thesis pioneers the development of q-Rung Neutrosophic Fuzzy Rough Sets (q-RNFRSs), establishing the first theoretical framework that integrates q-Rung Neutrosophic Sets with rough approximations to break through the conventional μ^q + η^q + ν^q ≤ 1 constraint of existing fuzzy–rough hybrids, achieving unprecedented capability in extreme uncertainty representation through our generalized model (T^q + I^q + F^q ≤ 3). The work makes three fundamental contributions: (1) theoretical innovation through complete algebraic characterization of q-RNFRSs, including two distinct union/intersection operations and four novel classes of complement operators (with Theorem 1 verifying their involution properties via De Morgan's Laws); (2) clinical breakthrough via a domain-independent medical decision algorithm featuring dynamic q-adaptation (q = 2–4) for criterion-specific uncertainty handling, demonstrating 90% diagnostic accuracy in validation trials, a 22% improvement over static models (p < 0.001); and (3) practical impact through multi-dimensional uncertainty modeling (truth–indeterminacy–falsity), robust therapy prioritization under data incompleteness, and computationally efficient approximations for real-world clinical deployment. Full article
(This article belongs to the Special Issue The Fusion of Fuzzy Sets and Optimization Using Symmetry)
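As a quick illustration of the two membership bounds quoted in this abstract, the sketch below (illustrative only, not the authors' implementation; the function names and sample values are invented here) checks a truth/indeterminacy/falsity triple against the conventional μ^q + η^q + ν^q ≤ 1 bound and the relaxed T^q + I^q + F^q ≤ 3 bound of the q-RNFRS model.

```python
# Minimal sketch (not the authors' code): the two constraints quoted in the abstract.

def satisfies_conventional_constraint(mu: float, eta: float, nu: float, q: int) -> bool:
    """Conventional bound of existing fuzzy-rough hybrids: mu^q + eta^q + nu^q <= 1."""
    return mu**q + eta**q + nu**q <= 1.0

def satisfies_qrnfs_constraint(t: float, i: float, f: float, q: int) -> bool:
    """Relaxed q-Rung Neutrosophic bound: T^q + I^q + F^q <= 3."""
    return t**q + i**q + f**q <= 3.0

# Example: (0.9, 0.8, 0.7) violates the conventional bound at q = 2
# but is admissible under the neutrosophic bound.
print(satisfies_conventional_constraint(0.9, 0.8, 0.7, q=2))  # False: 0.81 + 0.64 + 0.49 = 1.94 > 1
print(satisfies_qrnfs_constraint(0.9, 0.8, 0.7, q=2))         # True:  1.94 <= 3
```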
28 pages, 352 KiB  
Article
Algorithm Power and Legal Boundaries: Rights Conflicts and Governance Responses in the Era of Artificial Intelligence
by Jinghui He and Zhenyang Zhang
Laws 2025, 14(4), 54; https://doi.org/10.3390/laws14040054 - 31 Jul 2025
Viewed by 598
Abstract
This study explores the challenges and theoretical transformations that the widespread application of AI technology in social governance brings to the protection of citizens’ fundamental rights. By examining typical cases in judicial assistance, technology-enabled law enforcement, and welfare supervision, it explains how AI characteristics such as algorithmic opacity, data bias, and automated decision-making affect fundamental rights including due process, equal protection, and privacy. The article traces the historical evolution of privacy theory from physical space protection to informational self-determination and further to modern data rights, pointing out the inadequacy of traditional rights-protection paradigms in addressing the characteristics of AI technology. Through analyzing AI-governance models in the European Union, the United States, Northeast Asia, and international organizations, it demonstrates diverse governance approaches ranging from systematic risk regulation to decentralized industry regulation. With a special focus on China, the article analyzes the special challenges faced in AI governance and proposes specific recommendations for improving AI-governance paths. The article argues that only within the track of the rule of law, through continuous theoretical innovation, institutional construction, and international cooperation, can AI technology development be ensured to serve human dignity, freedom, and fair justice. Full article
20 pages, 834 KiB  
Article
Time-Fractional Evolution of Quantum Dense Coding Under Amplitude Damping Noise
by Chuanjin Zu, Baoxiong Xu, Hao He, Xiaolong Li and Xiangyang Yu
Fractal Fract. 2025, 9(8), 501; https://doi.org/10.3390/fractalfract9080501 - 30 Jul 2025
Viewed by 146
Abstract
In this paper, we investigate the memory effects introduced by the time-fractional Schrödinger equation proposed by Naber on quantum entanglement and quantum dense coding under amplitude damping noise. Two formulations are analyzed: one with fractional operations applied to the imaginary unit and one without. Numerical results show that the formulation without fractional operations on the imaginary unit may be more suitable for describing non-Markovian (power-law) behavior in dissipative environments. This finding provides a more physically meaningful interpretation of the memory effects in time-fractional quantum dynamics and indirectly addresses fundamental concerns regarding the violation of unitarity and probability conservation in such frameworks. Our work offers a new perspective for the application of fractional quantum mechanics to realistic open quantum systems and shows promise in supporting the theoretical modeling of decoherence and information degradation. Full article
15 pages, 6014 KiB  
Article
Predictive Analysis of Ventilation Dust Removal Time in Tunnel Blasting Operations Based on Numerical Simulation and Orthogonal Design Method
by Yun Peng, Shunchuan Wu, Yongjun Li, Lei He and Pengfei Wang
Processes 2025, 13(8), 2415; https://doi.org/10.3390/pr13082415 - 30 Jul 2025
Viewed by 253
Abstract
To enhance the understanding of dust diffusion laws in tunnel blasting operations of metal mines and determine optimal ventilation dust removal times, a scaled physical model of a metal mine tunneling face under the China Zijin Mining Group was established based on field measurements. Numerical simulation was employed to investigate airflow movement and dust migration in the tunneling roadway, and the fundamental features of the airflow field and the dust diffusion laws after tunnel blasting operations in the fully mechanized excavation face were revealed. The effects of three main factors, namely airflow rate (Q), ventilation distance (S), and tunnel length (L), on the dust removal time after tunnel blasting operations were investigated based on the orthogonal design method. Results indicated that reducing the dust concentration in the roadway to 10 mg/m³ required 53 min. The primary factors influencing dust removal time, in order of significance, were determined to be L, Q, and S. The lowest dust concentration occurred when the ventilation distance was 25 m. A predictive model for dust removal time after tunnel blasting operations was developed, establishing the relationship between dust removal time and the three factors as T = 20.7Q^(−0.73)S^(0.19)L^(0.86). Subsequent on-site validation confirmed the high accuracy of the predictive model, demonstrating its efficacy for practical applications. This study contributes a novel integration of orthogonal experimental design and validated CFD modeling to predict ventilation dust removal time, offering a practical and theoretically grounded approach for tunnel ventilation optimization. Full article
(This article belongs to the Section Particle Processes)
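For readers who want to evaluate the fitted model quoted in this abstract, here is a minimal sketch (not the authors' code; the input values are placeholders, and units follow the paper and are not restated in the abstract):

```python
# Minimal sketch: evaluating the fitted dust-removal-time model from the abstract,
# T = 20.7 * Q^(-0.73) * S^(0.19) * L^(0.86).

def dust_removal_time(Q: float, S: float, L: float) -> float:
    """Predicted ventilation dust removal time (min) after tunnel blasting."""
    return 20.7 * Q**-0.73 * S**0.19 * L**0.86

# Illustrative placeholder inputs; S = 25 is the ventilation distance the abstract
# reports as giving the lowest dust concentration.
print(round(dust_removal_time(Q=100.0, S=25.0, L=80.0), 1))  # about 57 min for these inputs
```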
31 pages, 11019 KiB  
Review
A Review of Tunnel Field-Effect Transistors: Materials, Structures, and Applications
by Shupeng Chen, Yourui An, Shulong Wang and Hongxia Liu
Micromachines 2025, 16(8), 881; https://doi.org/10.3390/mi16080881 - 29 Jul 2025
Viewed by 373
Abstract
Integrated circuit development faces the physical limits of Moore's Law. One of the most important "Beyond Moore" challenges is the tension between further scaling of Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) and their increasing static power consumption. At room temperature, the thermionic emission transport mechanism imposes a fundamental lower limit of 60 mV/decade on the subthreshold swing (SS) of MOSFETs, and off-state leakage current rises as devices continue to scale down. Moreover, short-channel effects degrade device performance ever more severely as channel length shrinks. Owing to their band-to-band tunneling mechanism, Tunnel Field-Effect Transistors (TFETs) can reach a far lower SS than MOSFETs. Recent research indicates that TFETs are promising candidates to replace conventional MOSFETs in ultra-low-power applications. This paper reviews advances in materials and structures along the evolution of TFETs. Both experimental and simulation works are discussed in depth. The performance of TFETs with different structures and materials is also explored in detail, covering Si, Ge, III-V compounds, and 2D materials, alongside various innovative device structures. Additionally, this work provides an outlook on the prospects of TFETs in future ultra-low-power electronics and biosensor applications. Full article
(This article belongs to the Special Issue MEMS/NEMS Devices and Applications, 3rd Edition)
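The 60 mV/decade figure cited in this abstract is the room-temperature thermionic (Boltzmann) limit SS = (k_B·T/q)·ln 10; the short sketch below simply evaluates it (a generic textbook calculation, not material taken from the review itself):

```python
# Minimal sketch: the thermionic limit on subthreshold swing, SS = (k_B * T / q) * ln(10).

import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def subthreshold_swing_limit(temperature_k: float) -> float:
    """Minimum subthreshold swing of a conventional MOSFET, in mV/decade."""
    return (K_B * temperature_k / Q_E) * math.log(10) * 1e3

print(round(subthreshold_swing_limit(300.0), 1))  # ~59.6 mV/decade at 300 K
```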
18 pages, 9954 KiB  
Article
Adaptive Continuous Non-Singular Terminal Sliding Mode Control for High-Pressure Common Rail Systems: Design and Experimental Validation
by Jie Zhang, Yinhui Yu, Sumin Wu, Wenjiang Zhu and Wenqian Liu
Processes 2025, 13(8), 2410; https://doi.org/10.3390/pr13082410 - 29 Jul 2025
Viewed by 235
Abstract
This paper first models the High-Pressure Common Rail System (HPCRS) from fundamental hydrodynamic principles and then formally defines the key control challenges. The proposed continuous sliding mode control strategy is developed based on a non-singular terminal sliding mode framework, integrated with an improved power reaching law. This design effectively eliminates chattering and achieves fast dynamic response with enhanced tracking precision. Subsequently, a bidirectional adaptive mechanism is integrated into the proposed control scheme to eliminate the necessity for a priori knowledge of unknown disturbances within the HPCRS. This mechanism enables real-time evaluation of the system's state relative to a predefined detection region. To validate the effectiveness of the proposed strategy, experimental studies are conducted under three distinct operating conditions. The experimental results indicate that, compared with conventional rail pressure controllers, the proposed method achieves superior tracking accuracy, faster dynamic response, and improved disturbance rejection. Full article
(This article belongs to the Special Issue Design and Analysis of Adaptive Identification and Control)
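As background for the reaching-law terminology in this abstract, the sketch below shows a generic continuous power reaching law of the form s' = −k1·|s|^α·sgn(s) − k2·s (a textbook form only; the paper's improved reaching law, sliding surface, and bidirectional adaptive mechanism are not reproduced here, and all gains are placeholders):

```python
# Minimal sketch (generic, not the paper's controller): a continuous power reaching law
# driving a sliding variable s toward zero.

import math

def power_reaching_law(s: float, k1: float = 2.0, k2: float = 1.0, alpha: float = 0.5) -> float:
    """Reaching term s_dot = -k1*|s|^alpha*sign(s) - k2*s for the sliding variable s."""
    return -k1 * abs(s) ** alpha * math.copysign(1.0, s) - k2 * s

# Crude forward-Euler illustration of s being driven toward zero.
s, dt = 1.0, 0.01
for _ in range(300):
    s += power_reaching_law(s) * dt
print(round(s, 4))  # near zero (a small residual remains from the crude Euler step)
```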
6 pages, 1910 KiB  
Proceeding Paper
Design and Construction of an Engine Oil Viscosity Meter with Electronic Control
by Penko Mitev, Atanasi Tashev and Yordan Stoyanov
Eng. Proc. 2025, 100(1), 55; https://doi.org/10.3390/engproc2025100055 - 22 Jul 2025
Viewed by 190
Abstract
This study presents the design and implementation of a novel, sensor-based falling-sphere viscometer specifically tailored for measuring the viscosity of engine oil. The equipment utilizes a metallic sphere and two strategically placed sensors to determine the travel time over a predetermined distance within an oil-filled tube. By applying fundamental principles of fluid dynamics, including Stokes' law, the system accurately calculates the dynamic viscosity based on the sphere's velocity and the oil's density. Experimental validation at a particular temperature demonstrates the device's sensitivity and reliability, which are critical for assessing oil degradation and engine performance. The simplicity and low cost of the design make it an attractive alternative to conventional, more complex viscometers. Furthermore, the automated data acquisition system reduces human error and enhances the reproducibility of results. Overall, the developed instrument shows great promise for both laboratory research and practical maintenance applications in the automotive industry. Full article
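The viscosity calculation described in this abstract follows Stokes' law; a minimal sketch of that calculation (assumed variable names and illustrative values, not the authors' firmware) is:

```python
# Minimal sketch: dynamic viscosity from the falling-sphere travel time via Stokes' law,
# eta = 2 * r^2 * g * (rho_sphere - rho_oil) / (9 * v), assuming laminar (low-Reynolds) settling.

G = 9.81  # gravitational acceleration, m/s^2

def dynamic_viscosity(radius_m: float, rho_sphere: float, rho_oil: float,
                      distance_m: float, travel_time_s: float) -> float:
    """Dynamic viscosity in Pa·s from the sphere's velocity between the two sensors."""
    velocity = distance_m / travel_time_s
    return 2.0 * radius_m**2 * G * (rho_sphere - rho_oil) / (9.0 * velocity)

# Illustrative values: a 2 mm steel sphere falling 0.2 m through engine oil in 1.5 s.
print(round(dynamic_viscosity(2e-3, 7800.0, 870.0, 0.2, 1.5), 3))  # ~0.45 Pa·s
```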
24 pages, 7960 KiB  
Article
Creep Behavior and Deformation Mechanism of Aluminum Alloy: Integrating Multiscale Simulation and Experiments
by Weizheng Lu, Jianguo Wu, Jiajun Liu, Xiaoai Yi, Qiyue Zhang, Yang Chen, Jia Li and Qihong Fang
Symmetry 2025, 17(7), 1146; https://doi.org/10.3390/sym17071146 - 17 Jul 2025
Viewed by 234
Abstract
Aluminum (Al) alloys exhibit exceptional mechanical properties, seeing widespread use in various industrial fields. Here, we use a multiscale simulation method combining phase field method, dislocation dynamics, and crystal plasticity finite element method to reveal the evolution law of precipitates, the interaction mechanism between dislocations and precipitates, and the grain-level creep deformation mechanism in 7A09 Al alloy under creep loading. The phase field method indicates that Al alloys tend to form fewer but larger precipitates during the creep process, under the dominant effect of stress-assisted Ostwald ripening. The dynamic equilibrium process of precipitate is not only controlled by classical diffusion mechanisms, but also closely related to the local strain field induced by dislocations and the elastic interaction between precipitates. Dislocation dynamics simulations indicate that the appearance of multiple dislocation loops around the precipitate during the creep process is the main dislocation creep deformation mechanism. A crystal plasticity finite element model is established based on experimental characterization to investigate the macroscopic creep mechanism. The dislocation climb is hindered by grain boundaries during creep, and high-density dislocation bands are formed around specific grains, promoting non-uniform plastic strain and leading to strong strain gradients. This work provides fundamental insights into understanding creep behavior and deformation mechanism of Al alloy for deep-sea environments. Full article
(This article belongs to the Section Engineering and Materials)
19 pages, 1827 KiB  
Article
Discrete Element Modeling of Concrete Under Dynamic Tensile Loading
by Ahmad Omar and Laurent Daudeville
Materials 2025, 18(14), 3347; https://doi.org/10.3390/ma18143347 - 17 Jul 2025
Viewed by 259
Abstract
Concrete is a fundamental material in structural engineering, widely used in critical infrastructure such as bridges, nuclear power plants, and dams. These structures may be subjected to extreme dynamic loads resulting from natural disasters, industrial accidents, or missile impacts. Therefore, a comprehensive understanding of concrete behavior under high strain rates is essential for safe and resilient design. Experimental investigations, particularly spalling tests, have highlighted the strain-rate sensitivity of concrete in dynamic tensile loading conditions. This study presents a macroscopic 3D discrete element model specifically developed to simulate the dynamic response of concrete subjected to extreme loading. Unlike conventional continuum-based models, the proposed discrete element framework is particularly suited to capturing damage and fracture mechanisms in cohesive materials. A key innovation lies in incorporating a physically grounded strain-rate dependency directly into the local cohesive laws that govern inter-element interactions. The originality of this work is further underlined by the validation of the discrete element model under dynamic tensile loading through the simulation of spalling tests on normal-strength concrete at strain rates representative of severe impact scenarios (30–115 s⁻¹). After calibrating the model under quasi-static loading, the simulations accurately reproduce key experimental outcomes, including rear-face velocity profiles and failure characteristics. Combined with prior validations under high confining pressure, this study reinforces the capability of the discrete element method for modeling concrete subjected to extreme dynamic loading, offering a robust tool for predictive structural assessment and design. Full article
(This article belongs to the Section Construction and Building Materials)
19 pages, 3731 KiB  
Article
Electric Field Measurement in Radiative Hyperthermia Applications
by Marco Di Cristofano, Luca Lalli, Giorgia Paglialunga and Marta Cavagnaro
Sensors 2025, 25(14), 4392; https://doi.org/10.3390/s25144392 - 14 Jul 2025
Viewed by 407
Abstract
Oncological hyperthermia (HT) is a medical technique aimed at heating a specific region of the human body containing a tumour. The heat makes the tumour cells more sensitive to the cytotoxic effects of radiotherapy and chemotherapy. Electromagnetic (EM) HT devices radiate a single-frequency EM field that induces a temperature increase in the treated region of the body. The typical radiative HT frequencies are between 60 and 150 MHz for deep HT applications, while 434 MHz and 915 MHz are used for superficial HT. The input EM power can reach up to 2000 W in deep HT and 250 W in superficial applications, and the E-field should be linearly polarized. This study proposes the development and use of E-field sensors to measure the distribution and evaluate the polarization of the E-field radiated by HT devices inside equivalent phantoms. This information is fundamental for the validation and assessment of HT systems. The sensor is constituted by three mutually orthogonal probes. Each probe is composed of a dipole, a diode, and a high-impedance transmission line. The fundamental difference in the operability of this sensor with respect to the standard E-field square-law detectors lies in the high-power values of the considered EM sources. Numerical analyses were performed to optimize the design of the E-field sensor in the whole radiative HT frequency range and to characterize the sensor behaviour at the power levels of HT. Then the sensor was realized, and measurements were carried out to evaluate the E-field radiated by commercial HT systems. The results show the suitability of the developed sensor to measure the E-field radiated by HT applicators. Additionally, in the measured devices, the linear polarization is evidenced. Accordingly, the work shows that in these devices, a single probe can be used to completely characterize the field distribution. Full article
(This article belongs to the Special Issue Microwaves for Biomedical Applications and Sensing)
22 pages, 814 KiB  
Article
When Institutions Cannot Keep up with Artificial Intelligence: Expiration Theory and the Risk of Institutional Invalidation
by Victor Frimpong
Adm. Sci. 2025, 15(7), 263; https://doi.org/10.3390/admsci15070263 - 7 Jul 2025
Viewed by 488
Abstract
As Artificial Intelligence systems increasingly surpass or replace traditional human roles, institutions founded on beliefs in human cognitive superiority, moral authority, and procedural oversight encounter a more profound challenge than mere disruption: expiration. This paper posits that, instead of being outperformed, many legacy institutions are becoming epistemically misaligned with the realities of AI-driven environments. To clarify this change, the paper presents the Expiration Theory. This conceptual model interprets institutional collapse not as a market failure but as the erosion of fundamental assumptions amid technological shifts. In addition, the paper introduces the AI Pressure Clock, a diagnostic tool that categorizes institutions based on their vulnerability to AI disruption and their capacity to adapt to it. Through an analysis across various sectors, including law, healthcare, education, finance, and the creative industries, the paper illustrates how specific systems are nearing functional obsolescence while others are actively restructuring their foundational norms. As a conceptual study, the paper concludes by highlighting the theoretical, policy, and leadership ramifications, asserting that institutional survival in the age of AI relies not solely on digital capabilities but also on the capacity to redefine the core principles of legitimacy, authority, and decision-making. Full article
14 pages, 214 KiB  
Article
The Scopes Trial and Its Long Shadow
by David H. Nikkel
Religions 2025, 16(7), 871; https://doi.org/10.3390/rel16070871 - 4 Jul 2025
Viewed by 343
Abstract
With the centennial this year of the Scopes “Monkey” Trial, this article examines the antagonistic relationship between American Christian fundamentalism and science, particularly evolution and other scientific knowledge challenging literal biblical interpretation. While the trial itself spanned only eleven days, its shadow has been quite long indeed. The article analyzes the background of the trial, fundamentalism then and now—including a later doubling down, contesting interpretations of the trial’s outcome, misremembrances and revisionism in the historical appropriations of the trial, and developments in evolutionary theory relevant to religion. In the process of these analyses, the article evidences the relationships of the Scopes trial on evolution and religion to law, politics, secondary and higher education, and communications and media. Finally, the article highlights past opportunities missed and lessons to be learned that might lessen conflict between religion and science in the future. Full article
23 pages, 360 KiB  
Article
Depicting Falsifiability in Algebraic Modelling
by Achim Schlather and Martin Schlather
J 2025, 8(3), 23; https://doi.org/10.3390/j8030023 - 4 Jul 2025
Viewed by 190
Abstract
This paper investigates how algebraic structures can encode epistemic limitations, with a focus on object properties and measurement. Drawing from philosophical concepts such as underdetermination, we argue that the weakening of algebraic laws can reflect foundational ambiguities in empirical access. Our approach supplies instruments that are necessary and sufficient for practical falsifiability. Besides introducing this new concept, we consider, as an illustrative starting point, two fundamental algebraic laws in more detail: the associative law and the commutative law. We explore and analyze weakened forms of these laws. As a mathematical feature, we demonstrate that the existence of a weak neutral element leads to the emergence of several transversal algebraic laws. Most laws are individually weaker than the combination of associativity and commutativity, but many pairs of two laws are equivalent to this combination. We also show that associativity and commutativity can be combined into a simple, single law, which we call cyclicity. We illustrate our approach with many tables and practical examples. Full article
(This article belongs to the Section Computer Science & Mathematics)
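To make the two classical laws discussed in this abstract concrete, the sketch below brute-force checks associativity and commutativity for binary operations on a small finite set (illustrative only; the paper's weakened laws and its cyclicity law are not reproduced here):

```python
# Minimal sketch: exhaustive checks of commutativity and associativity on a finite set.

from itertools import product

def is_commutative(op, elements) -> bool:
    return all(op(a, b) == op(b, a) for a, b in product(elements, repeat=2))

def is_associative(op, elements) -> bool:
    return all(op(op(a, b), c) == op(a, op(b, c))
               for a, b, c in product(elements, repeat=3))

elems = range(4)

# Addition modulo 4 satisfies both laws.
mod4_add = lambda a, b: (a + b) % 4
print(is_commutative(mod4_add, elems), is_associative(mod4_add, elems))  # True True

# Subtraction modulo 4 satisfies neither.
mod4_sub = lambda a, b: (a - b) % 4
print(is_commutative(mod4_sub, elems), is_associative(mod4_sub, elems))  # False False
```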