Search Results (12)

Search Parameters:
Keywords = mechanical reductionism

29 pages, 343 KiB  
Review
Robert Rosen’s Relational Biology Theory and His Emphasis on Non-Algorithmic Approaches to Living Systems
by Patricia A. Lane
Mathematics 2024, 12(22), 3529; https://doi.org/10.3390/math12223529 - 12 Nov 2024
Cited by 2 | Viewed by 2745
Abstract
This paper examines the use of algorithms and non-algorithmic models in mathematics and science, especially in biology, during the past century by summarizing the gradual development of a conceptual rationale for non-algorithmic models in biology. First, beginning a century ago, mathematicians found it impossible to constrain mathematics in an algorithmic straitjacket via Gödel’s Incompleteness Theorems, so how would it be possible in biology? By the 1930s, biology was resolutely imitating classical physics, with biologists enforcing a reductionist agenda to expunge function, purpose, teleology, and vitalism from biology. Interestingly, physicists and mathematicians often understood better than biologists that mathematical representations of living systems required different approaches than those of dead matter. Nicolas Rashevsky, the Father of Mathematical Biology, and Robert Rosen, his student, pointed out that the complex systems of life cannot be reduced to machines or mechanisms as per the Newtonian paradigm. Robert Rosen concluded that living systems are not amenable to algorithmic models that are primarily syntactical. Life requires semantics for its description. Rashevsky and Rosen pioneered Relational Biology, initially using Graph Theory to model living systems. Later, Rosen created a metabolic–repair model (M, R)-system using Category Theory to encode the basic entailments of life itself. Although reductionism still dominates in current biology, several subsequent authors have built upon the Rashevsky–Rosen intellectual foundation and have explained, extended, and explored its ramifications. Algorithmic formulations have become increasingly inadequate for investigating and modeling living systems. Biology is shifting from a science of simple systems to complex ones. This transition will only be successful once mathematics fully depicts what it means to be alive. This paper is a call to mathematicians from biologists asking for help in doing this.
Full article
(This article belongs to the Special Issue Non-algorithmic Mathematical Models of Biological Organization)
20 pages, 323 KiB  
Article
Quantum Mechanics and Inclusive Materialism
by Javier Pérez-Jara
Philosophies 2024, 9(5), 140; https://doi.org/10.3390/philosophies9050140 - 3 Sep 2024
Cited by 1 | Viewed by 4338
Abstract
Since its inception, the intricate mathematical formalism of quantum mechanics has empowered physicists to describe and predict specific physical events known as quantum processes. However, this success in probabilistic predictions has been accompanied by a profound challenge in the ontological interpretation of the theory. This interpretative complexity stems from two key aspects. Firstly, quantum mechanics is a fundamental theory that, so far, is not derivable from any more basic scientific theory. Secondly, it delves into a realm of invisible phenomena that often contradicts our intuitive and commonsensical notions of matter and causality. Despite its notorious difficulties of interpretation, the most widely accepted set of views of quantum phenomena has been known as the Copenhagen interpretation since the beginning of quantum mechanics. According to these views, the correct ontological interpretation of quantum mechanics is incompatible with ontological realism in general and with philosophical materialism in particular. Anti-realist and anti-materialist interpretations of quantum matter have survived until today. This paper discusses these perspectives, arguing that materialistic interpretations of quantum mechanics are compatible with its mathematical formalism, while anti-realist and anti-materialist views are based on wrong philosophical assumptions. However, although physicalism provides a better explanation for quantum phenomena than idealism, its downward reductionism prevents it from accounting for more complex forms of matter, such as biological or sociocultural systems. Thus, the paper argues that neither physicalism nor idealism can explain the universe. I propose then a non-reductionistic form of materialism called inclusive materialism. The conclusion is that the acknowledgment of the qualitative irreducibility of ontological emergent levels above the purely physical one does not deny philosophical materialism but enriches it. Full article
(This article belongs to the Special Issue Philosophy and Quantum Mechanics)
35 pages, 466 KiB  
Article
Nonclassical Systemics of Quasicoherence: From Formal Properties to Representations of Generative Mechanisms. A Conceptual Introduction to a Paradigm-Shift
by Gianfranco Minati
Systems 2019, 7(4), 51; https://doi.org/10.3390/systems7040051 - 28 Nov 2019
Cited by 4 | Viewed by 4681
Abstract
In this article, we consider how formal models and properties of emergence, e.g., long-range correlations, power laws, and self-similarity, are usually platonically considered to represent the essence of the phenomenon, more specifically, their acquired properties, e.g., coherence, and not their generative mechanisms. Properties are assumed to explain, rather than represent, real processes of emergence. Conversely, real phenomenological processes are intended to be approximations or degenerations of their essence. By contrast, here, we consider the essence as a simplification of the phenomenological complexity. It is assumed to be acceptable that such simplification neglects several aspects (e.g., incompleteness, inhomogeneities, instabilities, irregularities, and variations) of real phenomena in return for analytical tractability. Within this context, such a trade-off is a kind of reductionism when dealing with complex phenomena. Methodologically, we propose a paradigmatic change for systems science equivalent to the one that occurred in Physics from object to field, namely, a change from interactional entities to domains intended as extensions of fields, or multiple fields, as it were. The reason to introduce such a paradigm shift is to make nonidealist approaches suitable for dealing with more realistic quasicoherence, when the coherence does not consistently apply to all the composing entities, but rather, different forms of coherence apply. As a typical general interdisciplinary case, we focus on so-called collective behaviors. The goal of this paper is to introduce the concepts of domain and selection mechanisms, which are suitable to represent the generative mechanisms of quasicoherence of collective behavior. Domains are established by self-tracking entities, such as financial or effectively GPS-detectable ones. Such domains allow the profiling of collective behavior. Selection mechanisms are based on learning techniques or cognitive approaches for social systems. Full article
1 page, 128 KiB  
Editorial
A Brief Introduction to The First International Forum on Methodology of Information Ecology
by Yixin Zhong
Proceedings 2017, 1(3), 158; https://doi.org/10.3390/proceedings1030158 - 24 Jul 2017
Viewed by 1691
Abstract
Having comprehensively investigated research in the information discipline, we found that numerous mutually isolated theories exist, but knowledge about their mutual interrelations is lacking, and hence the discipline also lacks global theories, principles, and common laws. These phenomena result mainly from the employment of the methodology of mechanical reductionism, which seeks and focuses on characteristic constituent parts while neglecting the role of interactions and context, thus assuming that the whole is simply a collection of independent parts. Full article
3 pages, 158 KiB  
Proceeding Paper
Information Ecology
by Yixin Zhong
Proceedings 2017, 1(3), 139; https://doi.org/10.3390/IS4SI-2017-04004 - 9 Jun 2017
Cited by 2 | Viewed by 1637
Abstract
The purpose of this paper is to make a strong appeal to information researchers to take seriously the issue of the scientific methodology employed in the discipline so far: is it appropriate, or does it need to change? Full article
3 pages, 174 KiB  
Proceeding Paper
Information Ecology and Information Studies
by Yixin Zhong
Proceedings 2017, 1(3), 200; https://doi.org/10.3390/IS4SI-2017-04038 - 9 Jun 2017
Cited by 1 | Viewed by 1759
Abstract
Scientific methodology is widely accepted as a macroscopic reflection of the scientific view on the one hand, and as a general guideline for a certain class of research works on the other. Therefore, the choice of methodology is extremely crucial in scientific research. Whether the methodology employed in a field of research is appropriate will, to a large extent, determine whether achievements can be made in that field. However, no single general methodology exists that can be applied successfully to all fields of research. Which methodology should be employed in a given field depends on the kind of phenomenon being studied. This paper discusses methodological considerations for information studies, and the methodology of information ecology is recommended. Full article

5 pages, 196 KiB  
Proceeding Paper
Why Transdisciplinary Framework Is Necessary for Information Studies?
by Liqian Zhou
Proceedings 2017, 1(3), 155; https://doi.org/10.3390/IS4SI-2017-03990 - 9 Jun 2017
Viewed by 1706
Abstract
Information studies pursuing a unified theory of information are now trapped in dilemmas because of the hard problems of information, which involve purpose, function, reference, value, etc. Pan-informationalism takes information for granted and considers it a basic property of the cosmos, prior to physical properties. It avoids rather than solves the problem. The mainstream of information studies takes the position of methodological reductionism, reducing information to a property that can be quantitatively measured. This is helpful but leaves something essential behind. The transdisciplinary approach takes information as a phenomenon with multiple levels and dimensions that cannot be reduced to, but are complementary to, one another. Analogous to the principle of complementarity in quantum mechanics, the levels and dimensions of information cannot be mathematically transformed into one another, yet each is necessary for explaining information. The shifts between different levels and dimensions are not transformations in the mathematical sense but perspective conversions, like a Gestalt switch. Together they constitute an ecology of information. In this spirit, Brier's cybersemiotics and Deacon's theory of a nested hierarchy of information based on emergent dynamics give us insightful frameworks with which to investigate information. Full article
22 pages, 1292 KiB  
Review
Physiological Dynamics in Demyelinating Diseases: Unraveling Complex Relationships through Computer Modeling
by Jay S. Coggan, Stefan Bittner, Klaus M. Stiefel, Sven G. Meuth and Steven A. Prescott
Int. J. Mol. Sci. 2015, 16(9), 21215-21236; https://doi.org/10.3390/ijms160921215 - 7 Sep 2015
Cited by 28 | Viewed by 19551
Abstract
Despite intense research, few treatments are available for most neurological disorders. Demyelinating diseases are no exception. This is perhaps not surprising considering the multifactorial nature of these diseases, which involve complex interactions between immune system cells, glia and neurons. In the case of multiple sclerosis, for example, there is no unanimity among researchers about the cause or even which system or cell type could be ground zero. This situation precludes the development and strategic application of mechanism-based therapies. We will discuss how computational modeling applied to questions at different biological levels can help link together disparate observations and decipher complex mechanisms whose solutions are not amenable to simple reductionism. By making testable predictions and revealing critical gaps in existing knowledge, such models can help direct research and will provide a rigorous framework in which to integrate new data as they are collected. Nowadays, there is no shortage of data; the challenge is to make sense of it all. In that respect, computational modeling is an invaluable tool that could, ultimately, transform how we understand, diagnose, and treat demyelinating diseases. Full article
(This article belongs to the Special Issue Advances in Multiple Sclerosis)

32 pages, 1541 KiB  
Review
Philosophical Basis and Some Historical Aspects of Systems Biology: From Hegel to Noble - Applications for Bioenergetic Research
by Valdur Saks, Claire Monge and Rita Guzun
Int. J. Mol. Sci. 2009, 10(3), 1161-1192; https://doi.org/10.3390/ijms10031161 - 13 Mar 2009
Cited by 42 | Viewed by 19562
Abstract
We live in times of paradigmatic changes for the biological sciences. Reductionism, which for the last six decades has been the philosophical basis of biochemistry and molecular biology, is being displaced by Systems Biology, which favors the study of integrated systems. Historically, Systems Biology - defined as the higher-level analysis of complex biological systems - was pioneered by Claude Bernard in physiology, Norbert Wiener with the development of cybernetics, and Erwin Schrödinger in his thermodynamic approach to the living. Systems Biology applies methods inspired by cybernetics, network analysis, and the non-equilibrium dynamics of open systems. These developments follow very precisely the dialectical principles of development from thesis to antithesis to synthesis discovered by Hegel. Systems Biology opens new perspectives for studies of the integrated processes of energy metabolism in different cells. These integrated systems acquire new, system-level properties due to the interaction of cellular components, such as metabolic compartmentation, channeling, and functional coupling mechanisms, which are central to the regulation of energy fluxes. The state of the art of these studies in the new area of Molecular System Bioenergetics is analyzed. Full article
(This article belongs to the Special Issue Molecular System Bioenergetics)
10 pages, 310 KiB  
Article
Paradigm errors in the old biomedical science
by Albertas Skurvydas
Medicina 2008, 44(5), 356; https://doi.org/10.3390/medicina44050046 - 15 May 2008
Cited by 3 | Viewed by 1002
Abstract
The aim of this article was to review the basic drawbacks of deterministic and reductionistic thinking in biomedical science and to suggest ways of dealing with them. The present paradigm of research in biomedical science has not yet got rid of the errors of the old science, i.e., the errors of absolute determinism and reductionism. These errors restrict the view and thinking of scholars engaged in the study of complex and dynamic phenomena and mechanisms. Recently, discussions aimed at spreading a new science paradigm, that of complex dynamic systems and chaos theory, have been in progress all over the world. Only the near future will show which of the two, the old science or the new, will be the winner. Our main conclusion is that deterministic and reductionistic thinking, applied in an improper way, can cause substantial damage rather than provide benefits for biomedical science. Full article
7 pages, 166 KiB  
Article
Chinese Medicine and Biomodulation in Cancer Patients—Part One
by S. M. Sagar and R. K. Wong
Curr. Oncol. 2008, 15(1), 42-48; https://doi.org/10.3747/co.2008.197 - 1 Jan 2008
Cited by 64 | Viewed by 1165
Abstract
Traditional Chinese Medicine (tcm) may be integrated with conventional Western medicine to enhance the care of patients with cancer. Although tcm is normally implemented as a whole system, recent reductionist research suggests mechanisms for the effects of acupuncture, herbs, and nutrition within the scientific model of biomedicine. The health model of Chinese medicine accommodates physical and pharmacologic interventions within the framework of a body–mind network. A Cartesian split does not occur within this model, but to allow for scientific exploration within the restrictions of positivism, reductionism, and controls for confounding factors, the components must necessarily be separated. Still, whole-systems research is important to evaluate effectiveness when applying the full model in clinical practice. Scientific analysis provides a mechanistic understanding of the processes that will improve the design of clinical studies and enhance safety. Enough preliminary evidence is available to encourage quality clinical trials to evaluate the efficacy of integrating tcm into Western cancer care. Full article
6 pages, 151 KiB  
Article
Foundations of Information Science: Selected Papers from FIS 2002
by Pedro C. Marijuán
Entropy 2003, 5(2), 214-219; https://doi.org/10.3390/e5020214 - 30 Jan 2003
Cited by 8 | Viewed by 6259
Abstract
The accompanying papers in the first issue of Entropy, volume 5, 2003 were presented at the electronic conference on Foundations of Information Science FIS 2002 (http://www.mdpi.net/fis2002/). The running title of this FIS e-conference was THE NATURE OF INFORMATION: CONCEPTIONS, MISCONCEPTIONS, AND PARADOXES. It was held on the Internet from 6 to 10 May 2002, and was followed by a series of discussions, structured as focused sessions, which took place on the net from 10 May 2002 until 31 January 2003 (more than 400 messages were exchanged, see: http://fis.iguw.tuwien.ac.at/mailings/). This Introduction briefly surveys the problems surrounding the concept of information, presents the central ideas of the FIS initiative, and contrasts some of the basic differences between information and mechanics (reductionism). Full article