Sampling-Based Approaches to Calculating Marginal Densities by Gelfand A.E.
and Smith A.F.M., JASA, 1990, Vol. 85, pp. 398-409. [pdf]
Explaining the Gibbs Sampler by Casella G. and George E.I., The American
Statistician, 1992, Vol. 46, pp. 167-174.
Understanding the Metropolis-Hastings Algorithm by Chib S. and Greenberg E., The
American Statistician, 1995, Vol. 49, pp. 327-335.
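A minimal random-walk Metropolis sketch in R, as a companion to the Chib and Greenberg reading. The target (a Gamma(3, 1) density known only up to its normalizing constant) and the proposal scale are illustrative choices, not examples taken from the papers.

  ## Random-walk Metropolis for a target known up to a constant: pi(x) propto x^2 exp(-x), x > 0
  set.seed(1)
  log_target <- function(x) if (x > 0) 2 * log(x) - x else -Inf
  S <- 10000
  x <- numeric(S); x[1] <- 1
  for (s in 2:S) {
    prop <- x[s - 1] + rnorm(1, sd = 1)                     # symmetric (normal) proposal
    logr <- log_target(prop) - log_target(x[s - 1])
    x[s] <- if (log(runif(1)) < logr) prop else x[s - 1]    # accept with probability min(1, r)
  }
  mean(x[-(1:1000)])   # after burn-in, should be close to 3, the Gamma(3, 1) mean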
7. Re-analyses of the data sets used in the papers below by implementing
Gibbs sampling and the Metropolis algorithm (a small Gibbs sampler sketch in R
follows the readings below) [ps] [pdf]
Illustration of Bayesian Inference in Normal Data Models Using Gibbs
Sampling by Gelfand A.E., Hills S.E., Racine-Poon A., and Smith A.F.M., JASA,
1990, Vol. 85, pp. 972-985. [ps]
A Generalization of the Probit and Logit Methods for Dose Response Curves
by Prentice R.L., Biometrics, 1976, Vol. 32, pp. 761-768. [ps]
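As a companion to item 7, here is a small Gibbs sampler in R for the simplest normal data model of the kind treated by Gelfand, Hills, Racine-Poon, and Smith: unknown mean and variance with normal and inverse-gamma priors. The data and prior settings below are simulated for illustration; they are not the data sets from the papers.

  ## Gibbs sampler for y_i ~ N(mu, sigma2), mu ~ N(mu0, tau02), sigma2 ~ Inv-Gamma(a0, b0)
  set.seed(1)
  y <- rnorm(30, mean = 5, sd = 2)            # simulated data, for illustration only
  n <- length(y)
  mu0 <- 0; tau02 <- 100; a0 <- 2; b0 <- 1    # illustrative prior settings
  mu <- mean(y); sigma2 <- var(y)             # initial values
  S <- 5000
  draws <- matrix(NA, S, 2, dimnames = list(NULL, c("mu", "sigma2")))
  for (s in 1:S) {
    ## full conditional of mu: normal
    v <- 1 / (1 / tau02 + n / sigma2)
    m <- v * (mu0 / tau02 + sum(y) / sigma2)
    mu <- rnorm(1, m, sqrt(v))
    ## full conditional of sigma2: inverse-gamma
    sigma2 <- 1 / rgamma(1, a0 + n / 2, b0 + sum((y - mu)^2) / 2)
    draws[s, ] <- c(mu, sigma2)
  }
  colMeans(draws[-(1:1000), ])   # posterior means after burn-in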
8. Bayesian linear regression analysis, hierarchical linear regression
models, Bayesian variable selection (a Gibbs sampler sketch for the linear
model follows the reading below) [ps] [pdf]
Variable Selection Via Gibbs Sampling by George E.I. and McCulloch R.E., JASA,
1993, Vol. 88, pp. 881-889. [ps]
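A sketch of the basic Gibbs sampler for Bayesian linear regression in R, on simulated data. The priors (independent normal on the coefficients, inverse-gamma on the error variance) and all settings are illustrative; the George and McCulloch paper builds a variable-selection sampler of this general form by adding spike-and-slab indicators for the coefficients.

  ## Gibbs sampler for y = X beta + e, e ~ N(0, sigma2 I),
  ## with beta ~ N(0, tau2 I) and sigma2 ~ Inv-Gamma(a0, b0)
  set.seed(1)
  n <- 100; p <- 3
  X <- cbind(1, matrix(rnorm(n * (p - 1)), n))
  beta_true <- c(1, 2, 0)
  y <- drop(X %*% beta_true + rnorm(n))       # simulated response
  tau2 <- 100; a0 <- 2; b0 <- 1               # illustrative prior settings
  beta <- rep(0, p); sigma2 <- 1
  S <- 5000
  keep <- matrix(NA, S, p + 1)
  XtX <- crossprod(X); Xty <- crossprod(X, y)
  for (s in 1:S) {
    ## beta | sigma2, y ~ N(m, V)
    V <- solve(XtX / sigma2 + diag(p) / tau2)
    m <- V %*% (Xty / sigma2)
    beta <- drop(m + t(chol(V)) %*% rnorm(p))
    ## sigma2 | beta, y ~ Inv-Gamma(a0 + n/2, b0 + RSS/2)
    rss <- sum((y - X %*% beta)^2)
    sigma2 <- 1 / rgamma(1, a0 + n / 2, b0 + rss / 2)
    keep[s, ] <- c(beta, sigma2)
  }
  colMeans(keep[-(1:1000), ])   # posterior means of (beta, sigma2)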
9. Generalized linear models: hierarchical logistic regression, hierarchical
log-linear regression
Bayesian analyses of the rat tumor data and of the national air pollution and
mortality database [ps] [pdf]
Introduction to BUGS
(WinBUGS 1.3 for Windows, Classic BUGS 0.603 for Sparc stations).
One-parameter models: 1) normal with known variance (R and BUGS); and 2) Poisson
(BUGS); a conjugate-update sketch in R follows the files below. [lab2.pdf]
[slides2.pdf]
[norm1.b]
[pois.b]
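Since the contents of norm1.b and pois.b are not reproduced here, the R lines below only sketch the conjugate updates this lab covers, with made-up numbers: a normal mean with known variance, and, as an R counterpart to the BUGS-only Poisson example, a Poisson rate with a gamma prior.

  ## 1) Normal mean with known variance: mu ~ N(mu0, tau02), y_i ~ N(mu, sigma2)
  y <- c(9.8, 10.4, 10.1, 9.7)      # made-up observations
  sigma2 <- 1                       # known sampling variance
  mu0 <- 0; tau02 <- 100            # prior mean and variance
  n <- length(y)
  post_var  <- 1 / (1 / tau02 + n / sigma2)
  post_mean <- post_var * (mu0 / tau02 + sum(y) / sigma2)
  c(post_mean, sqrt(post_var))      # posterior mean and sd of mu

  ## 2) Poisson rate with gamma prior: lambda ~ Gamma(a, b), y_i ~ Poisson(lambda)
  ycount <- c(2, 0, 3, 1, 4)        # made-up counts
  a <- 1; b <- 1
  a_post <- a + sum(ycount); b_post <- b + length(ycount)
  qgamma(c(0.025, 0.5, 0.975), a_post, b_post)   # posterior quantiles of lambda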
Approximating the Posterior Distribution of all Unknown Parameters under a
Hierarchical Logistic Model: Estimating the risk of tumor in a group of rats
[hlogistic.S]
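A Metropolis-within-Gibbs sketch in R for a hierarchical logistic (random-intercept) model of per-group tumor risk, in the spirit of the lab above. The counts, priors, and proposal scale are invented for illustration; this is not the course's hlogistic.S script.

  ## y[i] tumors out of n[i] rats in group i; logit(p_i) = theta_i, theta_i ~ N(mu, sig2)
  set.seed(1)
  y <- c(1, 4, 9, 4);  n <- c(20, 20, 20, 20)   # made-up counts
  G <- length(y)
  mu <- 0; sig2 <- 1; theta <- rep(0, G)        # initial values
  a0 <- 2; b0 <- 1                              # Inv-Gamma(a0, b0) prior on sig2; flat prior on mu
  S <- 5000
  draws <- matrix(NA, S, G + 2)
  log_post_theta <- function(th, yi, ni, mu, sig2)
    yi * th - ni * log1p(exp(th)) - (th - mu)^2 / (2 * sig2)
  for (s in 1:S) {
    ## 1. Random-walk Metropolis update for each group-level logit theta[i]
    for (i in 1:G) {
      prop <- theta[i] + rnorm(1, sd = 0.5)
      logr <- log_post_theta(prop, y[i], n[i], mu, sig2) -
              log_post_theta(theta[i], y[i], n[i], mu, sig2)
      if (log(runif(1)) < logr) theta[i] <- prop
    }
    ## 2. Gibbs update for mu (flat prior gives a normal full conditional)
    mu <- rnorm(1, mean(theta), sqrt(sig2 / G))
    ## 3. Gibbs update for sig2 (conjugate inverse-gamma full conditional)
    sig2 <- 1 / rgamma(1, a0 + G / 2, b0 + sum((theta - mu)^2) / 2)
    draws[s, ] <- c(theta, mu, sig2)
  }
  colMeans(plogis(draws[-(1:1000), 1:G]))   # posterior mean tumor risk per group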