# Evidence and Credibility: Full Bayesian Significance Test for Precise Hypotheses


## Abstract


## 1. Introduction

## 2. The Evidence Calculus

Consider a posterior density $f(\theta )$ over the parameter space $\Theta $ and a precise null hypothesis $H:\theta \in {\Theta}_{0}$. Define ${f}^{*}$ as the maximum of the posterior density over the null hypothesis, attained at the argument ${\theta}^{*}$,

$${f}^{*}=f({\theta}^{*}),\qquad {\theta}^{*}\in \mathrm{arg}\underset{\theta \in {\Theta}_{0}}{\mathrm{max}}f(\theta ),$$

and let ${T}^{*}=\left\{\theta \in \Theta \mid f(\theta )>{f}^{*}\right\}$ be the set "tangential" to the null hypothesis, i.e., the set of parameter points with posterior density above ${f}^{*}$. (A figure in the full article displays ${T}^{*}$ for Examples 2 and 3 of Section 4.) The evidence supporting the null hypothesis is the posterior probability of the complement of ${T}^{*}$. That is, the evidence of the null hypothesis is

$$Ev(H)=1-\mathrm{Pr}\left\{\theta \in {T}^{*}|d\right\}=1-{\kappa}^{*}.$$

If the posterior probability ${\kappa}^{*}$ of ${T}^{*}$ is “large”, it means that the null set is in a region of low probability and the evidence in the data is against the null hypothesis. On the other hand, if the probability of ${T}^{*}$ is “small”, then the null set is in a region of high probability and the evidence in the data is in favor of the null hypothesis.

## 3. Numerical Computation

The evidence $Ev(H)$ is computed in two steps. The parameter space is a subset $\Theta \subseteq {R}^{n}$, and the hypothesis is defined as a further restricted subset ${\Theta}_{0}\subset \Theta \subseteq {R}^{n}$. Usually, ${\Theta}_{0}$ is defined by vector-valued inequality and equality constraints:

$${\Theta}_{0}=\left\{\theta \in \Theta \mid g(\theta )\le 0\wedge h(\theta )=0\right\}.$$

The two steps are:

- Numerical Optimization step:$$\theta^{*}\in \mathrm{arg}\underset{\theta \in {\Theta}_{0}}{\mathrm{max}}f(\theta ),\qquad \varphi ={f}^{*}=f({\theta}^{*});$$
- Numerical Integration step:$${\kappa}^{*}={\int}_{{T}^{*}}f(\theta )\,d\theta ,\qquad {T}^{*}=\left\{\theta \in \Theta \mid f(\theta )>\varphi \right\}.$$

Efficient numerical algorithms to find ${\theta}^{*}$ and to calculate ${\kappa}^{*}$ can be used under general conditions. Our purpose, however, is to discuss precise hypothesis testing under absolute continuity of the posterior probability model, the case for which most solutions presented in the literature are controversial.

## 4. Examples

For each example we report:

- our measure of evidence, Ev, for each observation d;
- the p-value, pV, obtained by the ${\chi}^{2}$ test, that is, the tail area;
- the Bayes Factor,$$BF=\frac{\mathrm{Pr}\left\{{\Theta}_{0}\right\}\mathrm{Pr}\left\{d|{\Theta}_{0}\right\}}{\left(1-\mathrm{Pr}\left\{{\Theta}_{0}\right\}\right)\mathrm{Pr}\left\{d|\Theta -{\Theta}_{0}\right\}};\quad \mathrm{and}$$
- the posterior probability of H,$$PP=\mathrm{Pr}\left\{{\Theta}_{0}|d\right\}={\left\{1+{(BF)}^{-1}\right\}}^{-1}.$$

#### 4.1. Success rate in standard binomial model

| d | Ev | pV | BF | PP |
|---|---|---|---|---|
| 0 | 0.00 | 0.00 | 0.00 | 0.00 |
| 1 | 0.00 | 0.00 | 0.00 | 0.00 |
| 2 | 0.00 | 0.00 | 0.00 | 0.00 |
| 3 | 0.00 | 0.00 | 0.02 | 0.02 |
| 4 | 0.01 | 0.01 | 0.10 | 0.09 |
| 5 | 0.02 | 0.03 | 0.31 | 0.24 |
| 6 | 0.06 | 0.07 | 0.78 | 0.44 |
| 7 | 0.16 | 0.18 | 1.55 | 0.61 |
| 8 | 0.35 | 0.37 | 2.52 | 0.72 |
| 9 | 0.64 | 0.65 | 3.36 | 0.77 |
| 10 | 1.00 | 1.00 | 3.70 | 0.79 |
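The two computational steps of Section 3 can be sketched for this example. The code below is our own illustration, not the authors' implementation; it assumes a uniform prior, $n=20$ trials with $d$ successes, and the precise null $H:p=1/2$ (assumptions consistent with the ${\chi}^{2}$ p-values in the table), so the posterior is Beta($d+1$, $21-d$).

```python
import numpy as np
from math import lgamma, exp

def beta_logpdf(x, a, b):
    """Log density of a Beta(a, b) distribution."""
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + (a - 1.0) * np.log(x) + (b - 1.0) * np.log(1.0 - x))

def fbst_evidence_binomial(d, n, p0=0.5, grid=100_001):
    """Ev(H: p = p0) for d successes in n trials, uniform prior (our sketch)."""
    a, b = d + 1.0, n - d + 1.0              # posterior is Beta(d+1, n-d+1)
    # Optimization step: the precise null {p0} is a single point, so f* = f(p0)
    f_star = exp(beta_logpdf(p0, a, b))
    # Integration step: kappa* = posterior mass of T* = {p : f(p) > f*}
    p = np.linspace(1e-9, 1.0 - 1e-9, grid)
    f = np.exp(beta_logpdf(p, a, b))
    kappa = np.sum(np.where(f > f_star, f, 0.0)) * (p[1] - p[0])
    return 1.0 - kappa                       # Ev(H) = 1 - kappa*

for d in (0, 5, 8, 10):
    print(d, round(fbst_evidence_binomial(d, 20), 2))
```

Since the null is a single point here, the optimization step is trivial and the integration step reduces to a one-dimensional quadrature over the tangential set.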

#### 4.2. Homogeneity test in 2×2 contingency table

**Left six columns: homogeneity test; right six columns: Hardy-Weinberg test (Section 4.3).**

| x | y | Ev | pV | BF | PP | x_{1} | x_{3} | Ev | pV | BF | PP |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 5 | 0 | 0.05 | 0.02 | 0.25 | 0.20 | 1 | 2 | 0.01 | 0.00 | 0.01 | 0.01 |
| 5 | 1 | 0.18 | 0.08 | 0.87 | 0.46 | 1 | 3 | 0.01 | 0.01 | 0.04 | 0.04 |
| 5 | 2 | 0.43 | 0.21 | 1.70 | 0.63 | 1 | 4 | 0.04 | 0.02 | 0.11 | 0.10 |
| 5 | 3 | 0.71 | 0.43 | 2.47 | 0.71 | 1 | 5 | 0.09 | 0.04 | 0.25 | 0.20 |
| 5 | 4 | 0.93 | 0.71 | 2.95 | 0.75 | 1 | 6 | 0.18 | 0.08 | 0.46 | 0.32 |
| 5 | 5 | 1.00 | 1.00 | 3.05 | 0.75 | 1 | 7 | 0.31 | 0.15 | 0.77 | 0.44 |
| 5 | 6 | 0.94 | 0.72 | 2.80 | 0.74 | 1 | 8 | 0.48 | 0.26 | 1.16 | 0.54 |
| 5 | 7 | 0.77 | 0.49 | 2.31 | 0.70 | 1 | 9 | 0.66 | 0.39 | 1.59 | 0.61 |
| 5 | 8 | 0.58 | 0.31 | 1.75 | 0.64 | 1 | 10 | 0.83 | 0.57 | 2.00 | 0.67 |
| 5 | 9 | 0.39 | 0.18 | 1.21 | 0.55 | 1 | 11 | 0.95 | 0.77 | 2.34 | 0.70 |
| 5 | 10 | 0.24 | 0.10 | 0.77 | 0.43 | 1 | 12 | 1.00 | 0.99 | 2.55 | 0.72 |
| 10 | 0 | 0.00 | 0.00 | 0.00 | 0.00 | 1 | 13 | 0.96 | 0.78 | 2.57 | 0.72 |
| 10 | 1 | 0.00 | 0.00 | 0.02 | 0.02 | 1 | 14 | 0.84 | 0.55 | 2.39 | 0.71 |
| 10 | 2 | 0.01 | 0.01 | 0.07 | 0.06 | 1 | 15 | 0.66 | 0.33 | 2.05 | 0.67 |
| 10 | 3 | 0.05 | 0.02 | 0.19 | 0.16 | 1 | 16 | 0.47 | 0.16 | 1.58 | 0.61 |
| 10 | 4 | 0.12 | 0.05 | 0.41 | 0.29 | 1 | 17 | 0.27 | 0.05 | 1.06 | 0.51 |
| 10 | 5 | 0.24 | 0.10 | 0.77 | 0.43 | 1 | 18 | 0.12 | 0.00 | 0.58 | 0.37 |
| 10 | 6 | 0.41 | 0.20 | 1.23 | 0.55 | 5 | 0 | 0.02 | 0.01 | 0.05 | 0.05 |
| 10 | 7 | 0.61 | 0.34 | 1.74 | 0.63 | 5 | 1 | 0.09 | 0.04 | 0.25 | 0.20 |
| 10 | 8 | 0.81 | 0.53 | 2.21 | 0.69 | 5 | 2 | 0.29 | 0.14 | 0.60 | 0.38 |
| 10 | 9 | 0.95 | 0.75 | 2.54 | 0.72 | 5 | 3 | 0.61 | 0.34 | 1.00 | 0.50 |
| 10 | 10 | 1.00 | 1.00 | 2.66 | 0.73 | 5 | 4 | 0.89 | 0.65 | 1.29 | 0.56 |
| 12 | 0 | 0.00 | 0.00 | 0.00 | 0.00 | 5 | 5 | 1.00 | 1.00 | 1.34 | 0.57 |
| 12 | 1 | 0.00 | 0.00 | 0.00 | 0.00 | 5 | 6 | 0.90 | 0.66 | 1.18 | 0.54 |
| 12 | 2 | 0.00 | 0.00 | 0.01 | 0.01 | 5 | 7 | 0.66 | 0.39 | 0.89 | 0.47 |
| 12 | 3 | 0.01 | 0.00 | 0.04 | 0.04 | 5 | 8 | 0.40 | 0.20 | 0.58 | 0.37 |
| 12 | 4 | 0.03 | 0.01 | 0.10 | 0.09 | 5 | 9 | 0.21 | 0.09 | 0.32 | 0.24 |
| 12 | 5 | 0.07 | 0.03 | 0.24 | 0.19 | 5 | 10 | 0.09 | 0.04 | 0.16 | 0.13 |
| 12 | 6 | 0.14 | 0.06 | 0.46 | 0.32 | 9 | 0 | 0.21 | 0.09 | 0.73 | 0.42 |
| 12 | 7 | 0.26 | 0.11 | 0.80 | 0.44 | 9 | 1 | 0.66 | 0.39 | 1.59 | 0.61 |
| 12 | 8 | 0.42 | 0.21 | 1.24 | 0.55 | 9 | 2 | 0.99 | 0.91 | 1.77 | 0.64 |
| 12 | 9 | 0.62 | 0.34 | 1.73 | 0.63 | 9 | 3 | 0.86 | 0.59 | 1.33 | 0.57 |
| 12 | 10 | 0.81 | 0.53 | 2.21 | 0.69 | 9 | 4 | 0.49 | 0.26 | 0.74 | 0.43 |
| | | | | | | 9 | 5 | 0.21 | 0.09 | 0.32 | 0.24 |
| | | | | | | 9 | 6 | 0.06 | 0.03 | 0.11 | 0.10 |
| | | | | | | 9 | 7 | 0.01 | 0.01 | 0.03 | 0.03 |

#### 4.3. Hardy-Weinberg equilibrium law

Here ${x}_{1}$ and ${x}_{3}$ are the two homozygote sample counts and ${x}_{2}=n-{x}_{1}-{x}_{3}$ is the heterozygote sample count. $\theta =\left[{\theta}_{1},{\theta}_{2},{\theta}_{3}\right]$ is the parameter vector. The posterior density for this trinomial model, under a uniform prior, is

$$f(\theta |x)\propto {\theta}_{1}^{{x}_{1}}{\theta}_{2}^{{x}_{2}}{\theta}_{3}^{{x}_{3}},\qquad {\theta}_{1}+{\theta}_{2}+{\theta}_{3}=1,$$

and the null hypothesis, the Hardy-Weinberg equilibrium law, is the curve

$${\Theta}_{0}=\left\{\theta \in \Theta \mid {\theta}_{1}={p}^{2},\text{ }{\theta}_{3}={\left(1-p\right)}^{2},\text{ }0<p<1\right\}.$$
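As a hedged numerical sketch of the FBST for the equilibrium law (our own illustration, not the authors' code), the following assumes a uniform Dirichlet prior and parametrizes the null curve by the allele frequency $p$, so that $\theta =\left({p}^{2},2p(1-p),{(1-p)}^{2}\right)$; ${\kappa}^{*}$ is estimated by Monte Carlo.

```python
import numpy as np
from math import lgamma

def dirichlet_logpdf(t, alpha):
    """Log density of a Dirichlet(alpha) at points t with shape (..., k)."""
    alpha = np.asarray(alpha, dtype=float)
    c = lgamma(alpha.sum()) - sum(lgamma(a) for a in alpha)
    return c + np.sum((alpha - 1.0) * np.log(t), axis=-1)

def fbst_evidence_hw(x1, x2, x3, n_mc=200_000, seed=0):
    """Ev(H: Hardy-Weinberg) for homozygote counts x1, x3, heterozygote x2."""
    alpha = (x1 + 1, x2 + 1, x3 + 1)          # posterior under a uniform prior
    # Optimization step: maximize f over the null curve
    #   theta(p) = (p^2, 2p(1-p), (1-p)^2), 0 < p < 1
    p = np.linspace(1e-4, 1.0 - 1e-4, 20_000)
    null = np.stack([p**2, 2.0*p*(1.0-p), (1.0-p)**2], axis=-1)
    f_star = dirichlet_logpdf(null, alpha).max()
    # Integration step: kappa* = Pr{f(theta) > f* | data}, by Monte Carlo
    rng = np.random.default_rng(seed)
    draws = rng.dirichlet(alpha, size=n_mc)
    kappa = np.mean(dirichlet_logpdf(draws, alpha) > f_star)
    return 1.0 - kappa
```

For the sample counts $(5,10,5)$ in the table of Section 4.2, the constrained maximum coincides with the posterior mode, so ${\kappa}^{*}\approx 0$ and $Ev\approx 1$.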

#### 4.4. Independence test in a 2×2 contingency table

| x_{00} | x_{01} | x_{10} | x_{11} | Ev | pV | BF | PP |
|---|---|---|---|---|---|---|---|
| 12 | 6 | 95 | 35 | 0.96 | 0.57 | 4.73 | 0.83 |
| 48 | 25 | 9 | 10 | 0.54 | 0.14 | 1.04 | 0.51 |
| 96 | 50 | 18 | 20 | 0.24 | 0.04 | 0.50 | 0.33 |
| 18 | 5 | 39 | 30 | 0.29 | 0.06 | 0.50 | 0.33 |
| 36 | 10 | 78 | 60 | 0.06 | 0.01 | 0.11 | 0.10 |
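The same two-step recipe applies to the independence test. The sketch below is our illustration (not the authors' code); it assumes a uniform Dirichlet prior on the four cell probabilities and parametrizes the independence surface by the row and column margins $r$ and $c$, i.e., ${\theta}_{ij}={r}_{i}{c}_{j}$.

```python
import numpy as np
from math import lgamma

def dirichlet_logpdf(t, alpha):
    """Log density of a Dirichlet(alpha) at points t with shape (..., k)."""
    alpha = np.asarray(alpha, dtype=float)
    c = lgamma(alpha.sum()) - sum(lgamma(a) for a in alpha)
    return c + np.sum((alpha - 1.0) * np.log(t), axis=-1)

def fbst_evidence_indep(x00, x01, x10, x11, n_mc=200_000, seed=0):
    """Ev(H: independence) for a 2x2 table, uniform prior (our sketch)."""
    alpha = (x00 + 1, x01 + 1, x10 + 1, x11 + 1)
    # Optimization step: grid search over the null surface theta_ij = r_i c_j,
    # with r = P(row 0) and c = P(column 0)
    r = np.linspace(1e-3, 1.0 - 1e-3, 500)[:, None]
    c = np.linspace(1e-3, 1.0 - 1e-3, 500)[None, :]
    null = np.stack([r*c, r*(1-c), (1-r)*c, (1-r)*(1-c)], axis=-1)
    f_star = dirichlet_logpdf(null, alpha).max()
    # Integration step: kappa* estimated by Monte Carlo over posterior draws
    rng = np.random.default_rng(seed)
    draws = rng.dirichlet(alpha, size=n_mc)
    kappa = np.mean(dirichlet_logpdf(draws, alpha) > f_star)
    return 1.0 - kappa
```

Note how doubling the counts at fixed proportions, as in the third and fifth rows of the table, shrinks the evidence in favor of independence.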

#### 4.5. Comparison of two gamma distributions

Brand 1 sample: 39.27, 31.72, 12.33, 27.67, 56.66, 28.32, 53.72, 29.71, 23.76, 33.55 (mean_{1} = 33.67, std_{1} = 13.33).

Brand 2 sample: 28.32, 53.72, 29.71, 23.76, 33.55, 24.07, 33.79, 33.10, 26.93, 27.23 (mean_{2} = 29.25, std_{2} = 3.62).

Evidence: $Ev(H\text{'})=0.89$ and $Ev(H)=0.01$.

## 5. Final Remarks


© 1999 by MDPI (http://www.mdpi.org). Reproduction is permitted for noncommercial purposes.

De Bragança Pereira, C.A.; Stern, J.M. Evidence and Credibility: Full Bayesian Significance Test for Precise Hypotheses. *Entropy* **1999**, *1*, 99-110. https://doi.org/10.3390/e1040099