Search Results (10)

Search Parameters:
Keywords = Ockham

25 pages, 461 KiB  
Article
A Deflationary Account of Information in Terms of Probability
by Riccardo Manzotti
Entropy 2025, 27(5), 514; https://doi.org/10.3390/e27050514 - 11 May 2025
Cited by 1 | Viewed by 888
Abstract
In this paper, I argue that information is nothing more than an abstract object; therefore, it does not exist fundamentally. It is neither a concrete physical entity nor a form of “stuff” that “flows” through communication channels or that is “carried” by vehicles or that is stored in memories, messages, books, or brains—these are misleading metaphors. To support this thesis, I adopt three different approaches. First, I present a series of concrete cases that challenge our commonsensical belief that information is a real entity. Second, I apply Eleaticism (the principle that entities lacking causal efficacy do not exist). Finally, I provide a mathematical derivation showing that information reduces to probability and is therefore unnecessary both ontologically and epistemically. In conclusion, I maintain that information is a causally redundant epistemic construct that does not exist fundamentally, regardless of its remarkable epistemic convenience. What, then, is information? It is merely a very efficient way of describing reality—a manner of speaking, nothing more. Full article
(This article belongs to the Special Issue Integrated Information Theory and Consciousness II)
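As a minimal illustration of the reduction described in the abstract: the standard textbook notion of self-information is already just a function of probability, I(x) = −log₂ p(x). The sketch below shows that identity only; it is an editorial illustration, not the paper's own derivation.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: I(x) = -log2 p(x).
    Nothing beyond a probability is needed to define it."""
    if not 0 < p <= 1:
        raise ValueError("p must lie in (0, 1]")
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Average self-information (Shannon entropy) of a distribution."""
    return sum(p * self_information(p) for p in probs if p > 0)

print(self_information(0.5))   # fair-coin outcome: 1.0 bit
print(entropy([0.5, 0.5]))     # fair-coin source: 1.0 bit per outcome
print(self_information(0.01))  # rare outcome: ~6.64 bits
```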
13 pages, 330 KiB  
Article
Future Actuality and Truth Ascriptions
by Andrea Iacona and Giuseppe Spolaore
Philosophies 2025, 10(2), 41; https://doi.org/10.3390/philosophies10020041 - 5 Apr 2025
Viewed by 439
Abstract
One question that arises in connection with Ockhamism, and that perhaps has not yet received the attention it deserves, is how a coherent formal account of truth ascriptions can be provided by using a suitable truth predicate in the object language. We address this question and show its implications for some semantic issues that have been discussed in the literature on future contingents. Arguably, understanding how truth ascriptions work at the formal level helps to gain a deeper insight into Ockhamism itself. Full article
(This article belongs to the Special Issue Exploring Concepts of Time and Tense)
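Background for readers new to the framework the abstract presupposes: in Ockhamist semantics, truth is relativized to a moment–history pair, and the future operator quantifies along the chosen history. The clause below is the standard one from the branching-time literature; the paper's treatment of an object-language truth predicate is not shown here.

```latex
% Standard Ockhamist clause for the future operator F, evaluated at a
% moment/history pair m/h with m \in h:
\mathcal{M}, m/h \models \mathrm{F}\varphi
  \;\iff\;
  \exists m' \in h \;\bigl(m < m' \text{ and } \mathcal{M}, m'/h \models \varphi\bigr)
```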
22 pages, 800 KiB  
Article
A Relational Semantics for Ockham’s Modalities
by Davide Falessi and Fabien Schang
Axioms 2023, 12(5), 445; https://doi.org/10.3390/axioms12050445 - 30 Apr 2023
Cited by 3 | Viewed by 1696
Abstract
This article aims to provide an extension of the modal square of opposition in light of Ockham’s account of modal operators. Moreover, we set forth some significant remarks on the de re–de dicto distinction and on the modal operator of contingency by means of a set-theoretic algebra called numbering semantics. This generalization, starting from Ockham’s account of modalities, allows us to consider whether Ockham’s account holds water and, if not, how it should be changed. Full article
(This article belongs to the Special Issue Modal Logic and Logical Geometry)
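The classical square of opposition that the paper generalizes can be brute-force checked on a toy model; the sketch below verifies only the familiar relations (contradictories, contraries, subcontraries, subalternation) for necessity and possibility over a non-empty set of accessible worlds, and is not the paper's numbering semantics.

```python
from itertools import product

def nec(vals):  # "necessarily p": p holds at every accessible world
    return all(vals)

def pos(vals):  # "possibly p": p holds at some accessible world
    return any(vals)

def check_square(n_worlds: int = 3) -> None:
    """Verify the modal square of opposition over all valuations of p
    on a fixed, non-empty set of accessible worlds."""
    for vals in product([True, False], repeat=n_worlds):
        p, not_p = list(vals), [not v for v in vals]
        assert nec(p) != pos(not_p)            # contradictories
        assert nec(not_p) != pos(p)            # contradictories
        assert not (nec(p) and nec(not_p))     # contraries never both true
        assert pos(p) or pos(not_p)            # subcontraries never both false
        assert (not nec(p)) or pos(p)          # subalternation: Np entails Pp
    print(f"Square relations hold on all {2 ** n_worlds} valuations")

check_square()
```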
16 pages, 293 KiB  
Article
Getting Real: Ockham on the Human Contribution to the Nature and Production of Artifacts
by Jenny Pelletier
Philosophies 2022, 7(5), 90; https://doi.org/10.3390/philosophies7050090 - 23 Aug 2022
Cited by 8 | Viewed by 2924
Abstract
Given his known predilection for ontological parsimony, Ockham’s ontology of artifacts is unsurprisingly reductionist: artifacts are nothing over and above their existing and appropriately ordered parts. However, the case of artifacts is notable in that they are real objects that human artisans produce by bringing about a real change: they spatially rearrange existing natural thing(s) or their parts for the sake of some end. This article argues that the human contribution to the nature and production of artifacts is two-fold: (1) the artisan’s cognitive grasp of her expertise and her decision to deploy that expertise are the two efficient causes necessary to explain the existence of an artifact, and (2) the purpose that the artisan had in mind when she decided to make an artifact fixes the function(s) of the artifact such that an artisan’s purpose is the final cause necessary to explain what an artifact is. Artifacts indeed exist, owing what they are and that they are to intelligent and volitional human activity, which Ockham never denies. The article submits that a myopic focus on Ockham’s indisputable reductionism does not exhaust what is metaphysically interesting and relevant about artifacts. Full article
(This article belongs to the Special Issue Art vs Nature: The Ontology of Artifacts in the Long Middle Ages)
22 pages, 5361 KiB  
Review
How the Big Bang Ends Up Inside a Black Hole
by Enrique Gaztanaga
Universe 2022, 8(5), 257; https://doi.org/10.3390/universe8050257 - 21 Apr 2022
Cited by 17 | Viewed by 5725
Abstract
The standard model of cosmology assumes that our Universe began 14 Gyrs (billion years) ago from a singular Big Bang creation. This can explain a vast range of different astrophysical data from a handful of free cosmological parameters. However, we have no direct evidence or fundamental understanding of some key assumptions: Inflation, Dark Matter and Dark Energy. Here we review the idea that cosmic expansion originates instead from gravitational collapse and bounce. The collapse generates a Black Hole (BH) of mass M ≃ 5 × 10²² M⊙ that formed 25 Gyrs ago. As there is no pressure support, the cold collapse can continue inside in free fall until it reaches atomic nuclear saturation (GeV), when it is halted by Quantum Mechanics, as two particles cannot occupy the same quantum state. The collapse then bounces like a core-collapse supernova, producing the Big Bang expansion. Cosmic acceleration results from the BH event horizon. During collapse, perturbations exit the horizon to re-enter during expansion, giving rise to the observed universe without the need for Inflation or Dark Energy. Using Ockham’s razor, this makes the BH Universe (BHU) model more compelling than the standard singular Big Bang creation. Full article
(This article belongs to the Special Issue Alternative Gravities and Fundamental Cosmology)
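A back-of-the-envelope check of the scale invoked above: the Schwarzschild radius of a black hole of 5 × 10²² solar masses is of the same order as the Hubble radius. The constants and the H0 value below are standard assumptions chosen for illustration; the calculation is an editorial sketch, not taken from the paper.

```python
# Schwarzschild radius of M ~ 5e22 solar masses vs. the Hubble radius c/H0.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg
Mpc = 3.086e22     # megaparsec, m
ly = 9.461e15      # light year, m

M = 5e22 * M_sun               # BH mass quoted in the abstract
r_s = 2 * G * M / c**2         # Schwarzschild radius
H0 = 70e3 / Mpc                # assumed Hubble constant, s^-1 (70 km/s/Mpc)
r_H = c / H0                   # Hubble radius

print(f"Schwarzschild radius: {r_s:.2e} m (~{r_s / ly / 1e9:.0f} Gly)")
print(f"Hubble radius:        {r_H:.2e} m (~{r_H / ly / 1e9:.0f} Gly)")
# Both are ~1e26 m, i.e. the same order of magnitude.
```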
27 pages, 29320 KiB  
Review
Nanoparticles to Target and Treat Macrophages: The Ockham’s Concept?
by Mireia Medrano-Bosch, Alazne Moreno-Lanceta and Pedro Melgar-Lesmes
Pharmaceutics 2021, 13(9), 1340; https://doi.org/10.3390/pharmaceutics13091340 - 26 Aug 2021
Cited by 25 | Viewed by 5614
Abstract
Nanoparticles are nanomaterials with three external nanoscale dimensions and an average size ranging from 1 to 1000 nm. Nanoparticles have gained notoriety in technological advances due to their tunable physical, chemical, and biological characteristics. However, the administration of functionalized nanoparticles to living beings is still challenging due to the rapid detection and blood and tissue clearance by the mononuclear phagocytic system. The major exponent of this system is the macrophage. Regardless of the nanomaterial composition, macrophages can detect and incorporate foreign bodies by phagocytosis. Therefore, the simplest explanation is that any injected nanoparticle will probably be taken up by macrophages. This explains, in part, the natural accumulation of most nanoparticles in the spleen, lymph nodes, and liver (the main organs of the mononuclear phagocytic system). For this reason, recent investigations are devoted to designing nanoparticles for specific macrophage targeting in diseased tissues. The aim of this review is to describe current strategies for the design of nanoparticles to target macrophages and to modulate their immunological function involved in different diseases, with special emphasis on chronic inflammation, tissue regeneration, and cancer. Full article
(This article belongs to the Special Issue Nanomaterials: Immunological Perspective)
2 pages, 531 KiB  
Editorial
SARS-CoV-2, “Common Cold” Coronaviruses’ Cross-Reactivity and “Herd Immunity”: The Razor of Ockham (1285-1347)?
by Nicola Petrosillo
Infect. Dis. Rep. 2020, 12(2), 8647; https://doi.org/10.4081/idr.2020.8647 - 29 May 2020
Cited by 3 | Viewed by 1419
Abstract
After the rapid spread of coronavirus disease 2019 (COVID-19) worldwide between February and April 2020, with a total of 5,267,419 confirmed cases and 341,155 deaths as of May 25, 2020, in recent weeks we have been observing a decrease in new infections in European countries, and the confirmed cases are not as severe as in the past, so much so that the number of patients transferred to intensive care for worsening systemic and pulmonary disease is dramatically decreasing. [...] Full article
22 pages, 4259 KiB  
Article
Bayesian Finite Element Model Updating and Assessment of Cable-Stayed Bridges Using Wireless Sensor Data
by Parisa Asadollahi, Yong Huang and Jian Li
Sensors 2018, 18(9), 3057; https://doi.org/10.3390/s18093057 - 12 Sep 2018
Cited by 41 | Viewed by 6175
Abstract
We focus on a Bayesian inference framework for finite element (FE) model updating of a long-span cable-stayed bridge using long-term monitoring data collected from a wireless sensor network (WSN). A robust Bayesian inference method is proposed which marginalizes the prediction-error precisions and applies the Transitional Markov Chain Monte Carlo (TMCMC) algorithm. The proposed marginalization of the error precisions is compared with two other treatments of the prediction-error precisions, namely constant error precisions and updated error precisions, through theoretical analysis and numerical investigation based on a bridge FE model. TMCMC is employed to draw samples from the posterior probability density function (PDF) of the structural model parameters and, if required, the uncertain prediction-error precision parameters. It is found that the proposed Bayesian inference method, with prediction-error precisions marginalized as “nuisance” parameters, produces an FE model with more accurate posterior uncertainty quantification and robust modal property prediction. When applying the modal parameters identified from acceleration data collected during a one-year period from the large-scale WSN on the bridge, we choose two candidate model classes using different parameter groupings based on the clustering results from a sensitivity analysis and apply Bayes’ Theorem at the model class level. By implementing the TMCMC sampler, both the posterior distributions of the structural model parameters and the plausibility of the two model classes are characterized given the real data. Computation of the posterior probabilities over the candidate model classes provides a procedure for Bayesian model class assessment, where the computation automatically implements the Bayesian Ockham razor, which trades off between data fitting and model complexity and penalizes model classes that “over-fit” the data. The results of FE model updating and assessment based on the real data using the proposed method show that the updated FE model can successfully predict modal properties of the structural system with high accuracy. Full article
(This article belongs to the Section Sensor Networks)
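The model-class assessment step described above boils down to normalizing evidences into posterior probabilities over the candidate classes. The snippet below is a generic sketch of that step with made-up log-evidence values; it is not the authors' TMCMC implementation, whose role here would be to estimate those log evidences.

```python
import numpy as np

def model_class_posteriors(log_evidences, log_priors=None):
    """Posterior probabilities P(M_j | D) from log evidences ln p(D | M_j).

    The evidence embodies the Bayesian Ockham razor: it rewards data fit
    but penalizes the information gained from prior to posterior, so overly
    complex model classes are down-weighted automatically."""
    log_ev = np.asarray(log_evidences, dtype=float)
    if log_priors is None:
        log_priors = np.zeros_like(log_ev)   # uniform prior over classes
    log_post = log_ev + log_priors
    log_post -= log_post.max()               # stabilize exponentiation
    post = np.exp(log_post)
    return post / post.sum()

# Hypothetical log evidences for two candidate parameter groupings:
print(model_class_posteriors([-1250.4, -1253.1]))   # ~[0.94, 0.06]
```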
22 pages, 303 KiB  
Article
A Complete Theory of Everything (Will Be Subjective)
by Marcus Hutter
Algorithms 2010, 3(4), 329-350; https://doi.org/10.3390/a3040329 - 29 Sep 2010
Cited by 16 | Viewed by 14357
Abstract
Increasingly encompassing models have been suggested for our world. Theories range from generally accepted to increasingly speculative to apparently bogus. The progression of theories from ego- to geo- to helio-centric models to universe and multiverse theories and beyond was accompanied by a dramatic increase in the sizes of the postulated worlds, with humans being expelled from their center to ever more remote and random locations. Rather than leading to a true theory of everything, this trend faces a turning point after which the predictive power of such theories decreases (actually to zero). Incorporating the location and other capacities of the observer into such theories avoids this problem and allows us to distinguish meaningful from predictively meaningless theories. This also leads to a truly complete theory of everything consisting of a (conventional objective) theory of everything plus a (novel subjective) observer process. The observer localization is neither based on the controversial anthropic principle, nor does it have anything to do with the quantum-mechanical observation process. The suggested principle is extended to more practical (partial, approximate, probabilistic, parametric) world models (rather than theories of everything). Finally, I provide a justification of Ockham’s razor, and criticize the anthropic principle, the doomsday argument, the no free lunch theorem, and the falsifiability dogma. Full article
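A toy illustration of the Ockham-razor idea the abstract alludes to (not Hutter's actual formalism): weight each hypothesis by 2^(−description length), discard those inconsistent with the data, and renormalize; shorter surviving theories then dominate. All names and lengths below are made up.

```python
def ockham_weights(hypotheses, data_consistent):
    """Toy description-length prior: weight each hypothesis by 2^(-length in bits),
    keep only those consistent with the data, and renormalize."""
    raw = {h: 2.0 ** -bits for h, bits in hypotheses.items() if h in data_consistent}
    total = sum(raw.values())
    return {h: w / total for h, w in raw.items()}

# Illustrative hypotheses with description lengths in bits:
hypotheses = {"short_theory": 20, "long_theory": 35, "refuted_theory": 10}
print(ockham_weights(hypotheses, data_consistent={"short_theory", "long_theory"}))
# short_theory receives ~0.99997 of the weight: shorter descriptions win.
```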
3 pages, 576 KiB  
Article
Ockham’s Razor is not so Sharp
by Mark A. Lewis, Kartik Agusala and Yuval Raizen
Infect. Dis. Rep. 2010, 2(2), e10; https://doi.org/10.4081/idr.2010.e10 - 23 Aug 2010
Cited by 2 | Viewed by 1
Abstract
A 39-year-old male with newly diagnosed HIV had cavitary pneumonia initially attributed to Pneumocystis jirovecii but actually caused by Rhodococcus equi. After neurological deterioration, he was found to have intracerebral lesions caused by Toxoplasma gondii. This case underscores the inability to rely on the search for a unifying diagnosis (Ockham’s Razor) in HIV-infected patients. Full article