Search Results (7)

Search Parameters:
Keywords = (un)certainty

17 pages, 336 KiB  
Review
Using the Constrained Disorder Principle to Navigate Uncertainties in Biology and Medicine: Refining Fuzzy Algorithms
by Yaron Ilan
Biology 2024, 13(10), 830; https://doi.org/10.3390/biology13100830 - 16 Oct 2024
Cited by 4 | Viewed by 1177
Abstract
Uncertainty in biology refers to situations in which information is imperfect or unknown. Variability, on the other hand, is measured by the frequency distribution of observed data. Biological variability adds to the uncertainty. The Constrained Disorder Principle (CDP) defines all systems in the universe by their inherent variability. According to the CDP, systems exhibit a degree of variability necessary for their proper function, allowing them to adapt to changes in their environments. Per the CDP, variability differs from uncertainty and can be viewed as a regulated mechanism for efficient functionality rather than as a source of uncertainty. This paper explores the various aspects of uncertainty in biology. It focuses on using CDP-based platforms to refine fuzzy algorithms and address some of the challenges associated with biological and medical uncertainties. Developing a fuzzy decision tree that considers the natural variability of systems can help minimize uncertainty. This method can reveal previously unidentified classes, reduce the number of unknowns, improve the accuracy of modeling results, and generate algorithm outputs that are more biologically and clinically relevant.
(This article belongs to the Section Theoretical Biology and Biomathematics)
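The fuzzy algorithms the abstract refers to rest on graded membership rather than crisp cut-offs. As a hypothetical illustration only (not code from the paper), a triangular membership function lets an observed value belong partially to a class, which is how natural variability can be absorbed instead of treated as error:

```python
def triangular_membership(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy set "normal heart rate", peaking at 70 bpm
for hr in (50, 70, 90):
    print(hr, triangular_membership(hr, 50, 70, 100))
```

A fuzzy decision tree would branch on such membership degrees instead of hard thresholds, so borderline cases contribute partially to several classes.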
14 pages, 2356 KiB  
Review
Challenges and (Un)Certainties for DNAm Age Estimation in Future
by Helena Correia Dias, Eugénia Cunha, Francisco Corte Real and Licínio Manco
Forensic Sci. 2022, 2(3), 601-614; https://doi.org/10.3390/forensicsci2030044 - 15 Sep 2022
Cited by 5 | Viewed by 2894
Abstract
Age estimation is a paramount issue in criminal, anthropological, and forensic research. Because of this, several areas of research have focused on establishing new approaches for age prediction, including biomolecular and anthropological methods. In recent years, DNA methylation (DNAm) has arisen as one of the hottest topics in the field. Many studies have developed age-prediction models (APMs) based on evaluation of DNAm levels of many genes in different tissue types and using different methodological approaches. However, several challenges and confounding factors should be considered before using methylation levels for age estimation in forensic contexts. To provide in-depth knowledge about DNAm age estimation (DNAm age) and to understand why it is not yet a current tool in forensic laboratories, this review surveys the most relevant scientific works published from 2015 to 2021 to address the challenges and future directions in the field. More than 60 papers were considered, focusing essentially on studies that developed models for age prediction in several sample types.
(This article belongs to the Special Issue Feature Papers in Forensic Sciences in 2022)
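The age-prediction models this review surveys are, at their core, typically linear combinations of methylation beta-values at selected CpG sites. A minimal sketch of that structure (the CpG identifiers, weights, and intercept below are invented for illustration, not taken from any published clock):

```python
# Hypothetical linear age-prediction model (APM): predicted age is an
# intercept plus a weighted sum of methylation beta-values at a few CpGs.
weights = {"cg001": 25.0, "cg002": -10.0, "cg003": 15.0}  # illustrative only
intercept = 20.0

def predict_age(methylation):
    """methylation: dict mapping CpG id -> beta-value in [0, 1]."""
    return intercept + sum(w * methylation[cpg] for cpg, w in weights.items())

sample = {"cg001": 0.8, "cg002": 0.3, "cg003": 0.5}
print(predict_age(sample))  # 20 + 20 - 3 + 7.5 = 44.5
```

Real clocks fit hundreds of such weights by penalized regression, and the confounders the review discusses (tissue type, disease, technical batch) enter precisely because the beta-values themselves shift.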

38 pages, 5910 KiB  
Article
OC6 Phase Ia: CFD Simulations of the Free-Decay Motion of the DeepCwind Semisubmersible
by Lu Wang, Amy Robertson, Jason Jonkman, Jang Kim, Zhi-Rong Shen, Arjen Koop, Adrià Borràs Nadal, Wei Shi, Xinmeng Zeng, Edward Ransley, Scott Brown, Martyn Hann, Pranav Chandramouli, Axelle Viré, Likhitha Ramesh Reddy, Xiang Li, Qing Xiao, Beatriz Méndez López, Guillén Campaña Alonso, Sho Oh, Hamid Sarlak, Stefan Netzband, Hyunchul Jang and Kai Yu
Energies 2022, 15(1), 389; https://doi.org/10.3390/en15010389 - 5 Jan 2022
Cited by 39 | Viewed by 6500
Abstract
Currently, the design of floating offshore wind systems is primarily based on mid-fidelity models with empirical drag forces. The tuning of the model coefficients requires data from either experiments or high-fidelity simulations. As part of the OC6 project (Offshore Code Comparison Collaboration, Continued, with Correlation, and unCertainty, under the International Energy Agency Wind Task 30 framework), the present investigation explores the latter option. A verification and validation study of computational fluid dynamics (CFD) models of the DeepCwind semisubmersible undergoing free-decay motion is performed. Several institutions provided CFD results for validation against the OC6 experimental campaign. The objective is to evaluate whether the CFD setups of the participants can provide valid estimates of the hydrodynamic damping coefficients needed by mid-fidelity models. The linear and quadratic damping coefficients and the equivalent damping ratio are chosen as metrics for validation. Large numerical uncertainties are estimated for the linear and quadratic damping coefficients; however, the equivalent damping ratios are more consistently predicted with lower uncertainty. Some difference is observed between the experimental and CFD surge-decay motion, caused by mechanical damping not considered in the simulations that likely originated from the mooring setup, including a Coulomb-friction-type force. Overall, the simulations and the experiment show reasonable agreement, thus demonstrating the feasibility of using CFD simulations to tune mid-fidelity models.
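The equivalent damping ratio used as a validation metric can be estimated from a free-decay record via the logarithmic decrement of successive peak amplitudes. A minimal sketch of that standard identity (not the participants' actual post-processing, which additionally separates linear and quadratic contributions):

```python
import math

def equivalent_damping_ratio(peaks):
    """Estimate the equivalent (linearized) damping ratio from successive
    positive peak amplitudes of a free-decay signal, using the mean
    logarithmic decrement delta and zeta = delta / sqrt(4*pi^2 + delta^2)."""
    decrements = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
    delta = sum(decrements) / len(decrements)
    return delta / math.sqrt(4 * math.pi**2 + delta**2)

# Synthetic decay with a known damping ratio of 5%
zeta_true = 0.05
delta = 2 * math.pi * zeta_true / math.sqrt(1 - zeta_true**2)
peaks = [math.exp(-delta * n) for n in range(6)]
print(round(equivalent_damping_ratio(peaks), 4))  # 0.05
```

Because this linearizes all dissipation into a single ratio, it is less sensitive to noise than fitting separate linear and quadratic coefficients, which is consistent with the abstract's finding that the equivalent ratio was the more consistently predicted metric.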

23 pages, 292 KiB  
Article
Youth Justice, Black Children and Young Men in Liverpool: A Story of Rac(ism), Identity and Contested Spaces
by John Wainwright, Laura Robertson, Cath Larkins and Mick Mckeown
Genealogy 2020, 4(2), 57; https://doi.org/10.3390/genealogy4020057 - 6 May 2020
Cited by 1 | Viewed by 4328
Abstract
This study explores the experiences of the black children and young men who attended a Youth Offending Team (YOT) in Liverpool, a city in the North of England, UK. It focuses on the perspectives of both the YOT practitioners and the black children/young men as they develop working relationships with each other. Through this two-way prism the black children/young men reflect on what is important to them before and after they enter the criminal justice system. Likewise, the YOT practitioners provide their understanding of the key issues in the young people’s lives—in particular, how the black children/young men made sense of their lives in Liverpool with a particular identity of place, space, class and race. A genealogy of race/class prism, along with an intersectional and appreciative inquiry methodology, was employed that encouraged the youth justice workers and young black men to explore the strengths and realities of their lives. Focus groups were undertaken with seven YOT practitioners and managers, along with semi-structured interviews with five black children/young men. The methodology focused on points of intersection of power, difference and identity. Findings that emerged from the participants included the experience of racism within the criminal justice system, the community and the wider city, along with the importance of education, employment and relations with the young people’s families. A core theme was the identity of black children/young men from a specific region: as Scousers, as black boys/young men, in the contestation over space, and in their negotiated identity regarding race. The ambivalence and (un)certainty that these identities evoked provide possibilities for youth justice practitioners engaging with young black men involved in serious and repeat offending.
15 pages, 4105 KiB  
Article
Tracking with (Un)Certainty
by Abe D. Hofman, Matthieu J. S. Brinkhuis, Maria Bolsinova, Jonathan Klaiber, Gunter Maris and Han L. J. van der Maas
J. Intell. 2020, 8(1), 10; https://doi.org/10.3390/jintelligence8010010 - 3 Mar 2020
Cited by 6 | Viewed by 5061
Abstract
One of the highest ambitions in educational technology is the move towards personalized learning. To this end, computerized adaptive learning (CAL) systems are developed. A popular method to track the development of student ability and item difficulty, in CAL systems, is the Elo Rating System (ERS). The ERS allows for dynamic model parameters by updating key parameters after every response. However, drawbacks of the ERS are that it does not provide standard errors and that it results in rating variance inflation. We identify three statistical issues responsible for both of these drawbacks. To solve these issues we introduce a new tracking system based on urns, where every person and item is represented by an urn filled with a combination of green and red marbles. Urns are updated, by an exchange of marbles after each response, such that the proportions of green marbles represent estimates of person ability or item difficulty. A main advantage of this approach is that the standard errors are known, hence the method allows for statistical inference, such as testing for learning effects. We highlight features of the Urnings algorithm and compare it to the popular ERS in a simulation study and in an empirical data example from a large-scale CAL application.
(This article belongs to the Special Issue New Methods and Assessment Approaches in Intelligence Research)
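For contrast with the urn scheme, the Elo Rating System update that the abstract critiques can be sketched in a few lines (the step size K and the logistic link below are common illustrative defaults, not the authors' exact configuration):

```python
import math

def elo_update(theta, beta, correct, k=0.4):
    """One Elo Rating System step: nudge person ability (theta) and item
    difficulty (beta) after a response; `correct` is 1 or 0. Note the ERS
    yields point estimates only -- no standard errors, which is the
    drawback the Urnings approach is designed to fix."""
    expected = 1.0 / (1.0 + math.exp(beta - theta))  # Rasch-style success probability
    theta_new = theta + k * (correct - expected)
    beta_new = beta - k * (correct - expected)
    return theta_new, beta_new

theta, beta = 0.0, 0.0                       # evenly matched: expected = 0.5
theta, beta = elo_update(theta, beta, correct=1)
print(theta, beta)                           # 0.2 -0.2
```

In the Urnings system the same information is instead carried by counts of green marbles in finite urns, so the estimate's sampling distribution (and hence its standard error) is known exactly.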

26 pages, 721 KiB  
Article
Unification of Epistemic and Ontic Concepts of Information, Probability, and Entropy, Using Cognizers-System Model
by Toshiyuki Nakajima
Entropy 2019, 21(2), 216; https://doi.org/10.3390/e21020216 - 24 Feb 2019
Cited by 7 | Viewed by 4714
Abstract
Information and probability are common words used in scientific investigations. However, information and probability both involve epistemic (subjective) and ontic (objective) interpretations under the same terms, which causes controversy within the concept of entropy in physics and biology. There is another issue regarding the circularity between information (or data) and reality: The observation of reality produces phenomena (or events), whereas the reality is confirmed (or constituted) by phenomena. The ordinary concept of information presupposes reality as a source of information, whereas another type of information (known as it-from-bit) constitutes the reality from data (bits). In this paper, a monistic model, called the cognizers-system model (CS model), is employed to resolve these issues. In the CS model, observations (epistemic) and physical changes (ontic) are both unified as “cognition”, meaning a related state change. Information and probability, epistemic and ontic, are formalized and analyzed systematically using a common theoretical framework of the CS model or a related model. Based on the results, a perspective for resolving controversial issues of entropy originating from information and probability is presented.

17 pages, 527 KiB  
Article
Using the Data Agreement Criterion to Rank Experts’ Beliefs
by Duco Veen, Diederick Stoel, Naomi Schalken, Kees Mulder and Rens Van de Schoot
Entropy 2018, 20(8), 592; https://doi.org/10.3390/e20080592 - 9 Aug 2018
Cited by 13 | Viewed by 4168 | Correction
Abstract
Experts’ beliefs embody a present state of knowledge. It is desirable to take this knowledge into account when making decisions. However, ranking experts based on the merit of their beliefs is a difficult task. In this paper, we show how experts can be ranked based on their knowledge and their level of (un)certainty. By letting experts specify their knowledge in the form of a probability distribution, we can assess how accurately they can predict new data, and how appropriate their level of (un)certainty is. The expert’s specified probability distribution can be seen as a prior in a Bayesian statistical setting. We evaluate these priors by extending an existing prior-data (dis)agreement measure, the Data Agreement Criterion, and compare this approach to using Bayes factors to assess prior specification. We compare experts with each other and with the data to evaluate their appropriateness. Using this method, new research questions can be asked and answered, for instance: Which expert predicts the new data best? Is there agreement between my experts and the data? Which expert’s representation is more valid or useful? Can we reach convergence between expert judgement and data? We provide an empirical example, ranking (regional) directors of a large financial institution based on their predictions of turnover.
(This article belongs to the Special Issue Bayesian Inference and Information Theory)
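The Data Agreement Criterion is built on Kullback–Leibler divergences between the information in the data and each expert's prior. As a simplified stand-in (not the authors' exact criterion, which normalizes against a benchmark prior), experts who state normal priors can be ranked by their KL divergence from a normal summary of the observed data:

```python
import math

def kl_normal(mu0, sd0, mu1, sd1):
    """KL divergence KL( N(mu0, sd0) || N(mu1, sd1) ) between two normals."""
    return (math.log(sd1 / sd0)
            + (sd0**2 + (mu0 - mu1)**2) / (2 * sd1**2)
            - 0.5)

# Hypothetical experts: (name, prior mean, prior sd) for predicted turnover
experts = [("A", 10.0, 2.0), ("B", 12.0, 1.0), ("C", 10.0, 1.0)]
data_mu, data_sd = 10.0, 1.0   # normal summary of the observed data

# Smaller divergence from the data = better-calibrated expert
ranking = sorted(experts, key=lambda e: kl_normal(data_mu, data_sd, e[1], e[2]))
print([name for name, _, _ in ranking])  # expert C matches the data exactly
```

Note the asymmetry of KL matters here: an overconfident expert (B, wrong mean, tight sd) is penalized more heavily than a vaguer one (A), which mirrors the paper's point that the appropriateness of an expert's (un)certainty is part of the ranking, not just the accuracy of the central prediction.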
