Bayesian Statistics (4th ed)


by Peter M Lee

X has a z distribution on $\nu_1$ and $\nu_2$ degrees of freedom, denoted

  $X \sim z(\nu_1, \nu_2),$

  if $e^{2X} \sim \mathrm{F}(\nu_1, \nu_2)$, or equivalently if it has the density

  $p(x) = \frac{2\,\nu_1^{\nu_1/2}\,\nu_2^{\nu_2/2}}{B(\nu_1/2,\,\nu_2/2)}\;\frac{e^{\nu_1 x}}{(\nu_2 + \nu_1 e^{2x})^{(\nu_1+\nu_2)/2}}.$

  Another definition is that if $F \sim \mathrm{F}(\nu_1, \nu_2)$, then

  $z = \tfrac{1}{2}\log F \sim z(\nu_1, \nu_2).$

  The mean and variance are easily deduced from those of the log chi-squared distribution; they are

  $\mathsf{E}x = \tfrac{1}{2}\{\psi(\nu_1/2) - \psi(\nu_2/2) + \log(\nu_2/\nu_1)\}, \qquad \mathcal{V}x = \tfrac{1}{4}\{\psi'(\nu_1/2) + \psi'(\nu_2/2)\},$

  where $\psi$ is (as above) the digamma function, or approximately

  $\mathsf{E}x \cong \tfrac{1}{2}(1/\nu_2 - 1/\nu_1), \qquad \mathcal{V}x \cong \tfrac{1}{2}(1/\nu_1 + 1/\nu_2).$

  The mode is zero.

  Unless $\nu_1$ and $\nu_2$ are very small, the distribution of z is approximately normal. The z distribution was introduced by Fisher (1924).
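As a numerical illustration (my own, in Python rather than the R of Appendix C; the degrees of freedom and sample size are arbitrary choices), one can simulate $z = \tfrac{1}{2}\log F$ from a ratio of scaled chi-squared variates and compare the sample mean and variance with the approximations $\tfrac{1}{2}(1/\nu_2 - 1/\nu_1)$ and $\tfrac{1}{2}(1/\nu_1 + 1/\nu_2)$:

```python
import math
import random

random.seed(1)
nu1, nu2, n = 20, 20, 50_000

def chi2(nu):
    # chi-squared(nu) is gamma with shape nu/2 and scale 2
    return random.gammavariate(nu / 2, 2)

# z = (1/2) log F, where F = (chi1^2/nu1) / (chi2^2/nu2)
zs = [0.5 * math.log((chi2(nu1) / nu1) / (chi2(nu2) / nu2)) for _ in range(n)]
mean = sum(zs) / n
var = sum((z - mean) ** 2 for z in zs) / n

approx_mean = 0.5 * (1 / nu2 - 1 / nu1)  # zero here, since nu1 = nu2
approx_var = 0.5 * (1 / nu1 + 1 / nu2)   # 0.05 here
print(mean, var)
```

With equal degrees of freedom the simulated mean is close to zero and the variance close to 0.05, in line with the approximate normality noted above.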

  A.21 Cauchy distribution

X has a Cauchy distribution with location parameter μ and scale parameter σ, denoted

  $X \sim \mathrm{C}(\mu, \sigma),$

  if

  $p(x) = \frac{1}{\pi}\,\frac{\sigma}{\sigma^2 + (x-\mu)^2} \qquad (-\infty < x < \infty).$

  Because the relevant integral is not absolutely convergent, this distribution does not have a finite mean, nor, a fortiori, a finite variance. However, it is symmetrical about μ, and hence its median is μ.

  The distribution function is

  $F(x) = \tfrac{1}{2} + \tfrac{1}{\pi}\tan^{-1}\!\left(\frac{x-\mu}{\sigma}\right),$

  so that when $x = \mu - \sigma$ then F(x)=1/4 and when $x = \mu + \sigma$ then F(x)=3/4. Thus, $\mu - \sigma$ and $\mu + \sigma$ are, respectively, the lower and upper quartiles, and hence σ may be thought of as the semi-interquartile range. Note that for a normal distribution the semi-interquartile range is $0.6745\sigma$ rather than σ.

  It may be noted that the C(0, 1) distribution is also the Student’s t distribution on 1 degree of freedom.
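A quick check of the quartiles (my own illustration, not from the book; the values of μ and σ are arbitrary) evaluates the distribution function $F(x) = \tfrac{1}{2} + \tfrac{1}{\pi}\tan^{-1}((x-\mu)/\sigma)$ at $\mu \pm \sigma$:

```python
import math

def cauchy_cdf(x, mu=0.0, sigma=1.0):
    # Distribution function of the Cauchy distribution C(mu, sigma)
    return 0.5 + math.atan((x - mu) / sigma) / math.pi

mu, sigma = 3.0, 2.0
print(cauchy_cdf(mu - sigma, mu, sigma))  # close to 0.25: the lower quartile
print(cauchy_cdf(mu + sigma, mu, sigma))  # close to 0.75: the upper quartile
```

The quartiles fall at μ − σ and μ + σ exactly, confirming that σ is the semi-interquartile range.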

  A.22 The probability that one beta variable is greater than another

  Suppose π and ρ have independent beta distributions

  $\pi \sim \mathrm{Be}(\alpha, \beta), \qquad \rho \sim \mathrm{Be}(\gamma, \delta).$

  Then $\mathrm{P}(\pi > \rho)$ can be found in closed form [see Altham (1969)]. For an expression (albeit a complicated one) for the distribution of the difference $\pi - \rho$, see Weisberg (1972).

  When α, β, γ and δ are large, we can approximate the beta variables by normal variates of the same means and variances and hence approximate the distribution of $\pi - \rho$ by a normal distribution.
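The normal approximation can be checked by simulation (my own sketch, in Python rather than the book's R; the parameter values are arbitrary but fairly large, as the text requires). A Monte Carlo estimate of $\mathrm{P}(\pi > \rho)$ is compared with the probability obtained by matching beta means and variances to a normal:

```python
import math
import random

random.seed(2)
alpha, beta, gamma, delta = 40, 60, 30, 70

def beta_moments(a, b):
    # Mean and variance of the Be(a, b) distribution
    m = a / (a + b)
    v = a * b / ((a + b) ** 2 * (a + b + 1))
    return m, v

n = 100_000
mc = sum(random.betavariate(alpha, beta) > random.betavariate(gamma, delta)
         for _ in range(n)) / n

m1, v1 = beta_moments(alpha, beta)
m2, v2 = beta_moments(gamma, delta)
# P(pi - rho > 0) under the normal approximation, via the standard normal cdf
approx = 0.5 * (1 + math.erf((m1 - m2) / math.sqrt(2 * (v1 + v2))))
print(mc, approx)
```

For parameters of this size the two answers agree to roughly two decimal places.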

  A.23 Bivariate normal distribution

  The ordered pair (X, Y)^T of observations has a bivariate normal distribution, denoted

  $\begin{pmatrix} X \\ Y \end{pmatrix} \sim \mathrm{N}\!\left( \begin{pmatrix} \mu_X \\ \mu_Y \end{pmatrix}, \begin{pmatrix} \sigma_X^2 & \rho\sigma_X\sigma_Y \\ \rho\sigma_X\sigma_Y & \sigma_Y^2 \end{pmatrix} \right),$

  if it has joint density

  $p(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\,\exp(-Q/2),$

  where

  $Q = \frac{1}{1-\rho^2}\left[ \left(\frac{x-\mu_X}{\sigma_X}\right)^{\!2} - 2\rho\left(\frac{x-\mu_X}{\sigma_X}\right)\!\left(\frac{y-\mu_Y}{\sigma_Y}\right) + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^{\!2} \right].$

  The means and variances are

  $\mathsf{E}X = \mu_X, \quad \mathsf{E}Y = \mu_Y, \quad \mathcal{V}X = \sigma_X^2, \quad \mathcal{V}Y = \sigma_Y^2,$

  and X and Y have correlation coefficient ρ and covariance

  $\mathcal{C}(X, Y) = \rho\,\sigma_X\sigma_Y.$

  Most properties follow from those of the ordinary univariate normal distribution. One point worth noting (which is clear from the form of the joint density function) is that if X and Y have a bivariate normal distribution, then they are independent if they are uncorrelated (a result which is not true in general).

  A.24 Multivariate normal distribution

  An n-dimensional random vector $\mathbf{X}$ has a multivariate normal distribution with mean vector $\boldsymbol{\mu}$ and variance–covariance matrix $\boldsymbol{\Sigma}$, denoted

  $\mathbf{X} \sim \mathrm{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma}),$

  if it has joint density function

  $p(\mathbf{x}) = (2\pi)^{-n/2}\,|\boldsymbol{\Sigma}|^{-1/2}\exp\{-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{T}\boldsymbol{\Sigma}^{-1}(\mathbf{x}-\boldsymbol{\mu})\}.$

  It can be checked by finding the determinant of the variance–covariance matrix

  $|\boldsymbol{\Sigma}| = \sigma_X^2\sigma_Y^2(1-\rho^2)$

  and inverting it that the bivariate normal distribution is a special case.
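The check just described can be carried out numerically (my own illustration; all parameter values are arbitrary): the bivariate density written out with means, standard deviations and correlation agrees with the general matrix formula, using $|\boldsymbol{\Sigma}| = \sigma_X^2\sigma_Y^2(1-\rho^2)$ and the explicit inverse of a 2 × 2 matrix:

```python
import math

def bivariate_pdf(x, y, mx, my, sx, sy, rho):
    # Density written out in terms of means, sds and correlation (Section A.23)
    u, v = (x - mx) / sx, (y - my) / sy
    q = (u * u - 2 * rho * u * v + v * v) / (1 - rho ** 2)
    return math.exp(-q / 2) / (2 * math.pi * sx * sy * math.sqrt(1 - rho ** 2))

def matrix_pdf(x, y, mx, my, sx, sy, rho):
    # General multivariate formula with Sigma = [[sx^2, rho*sx*sy],
    #                                            [rho*sx*sy, sy^2]]
    det = sx ** 2 * sy ** 2 * (1 - rho ** 2)
    # entries of Sigma^{-1}, written out explicitly for the 2x2 case
    a, b, c = sy ** 2 / det, -rho * sx * sy / det, sx ** 2 / det
    dx, dy = x - mx, y - my
    q = a * dx * dx + 2 * b * dx * dy + c * dy * dy
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(det))

p1 = bivariate_pdf(1.0, -0.5, 0.2, 0.1, 1.5, 0.8, 0.6)
p2 = matrix_pdf(1.0, -0.5, 0.2, 0.1, 1.5, 0.8, 0.6)
print(p1, p2)  # the two values agree to machine precision
```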

  A.25 Distribution of the correlation coefficient

  If the prior density of the correlation coefficient ρ is p(ρ), then its posterior density, given n pairs of observations (Xi, Yi) with sample correlation coefficient r, is given by

  on writing . It can also be shown that

  When r=0 the density simplifies to

  and so if the prior is of the form

  it can be shown that

  has a Student’s t distribution on degrees of freedom.

  Going back to the general case, it can be shown that

  Expanding the term in square brackets as a power series in u, we can express the last integral as a sum of beta functions. Taking only the first term, we have as an approximation

  On writing

  $z = \tanh^{-1} r, \qquad \zeta = \tanh^{-1}\rho,$

  it can be shown that

  and hence that for large n

  $z \sim \mathrm{N}(\zeta,\, 1/n)$

  approximately, whatever the prior p(ρ) is. A better approximation is
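The approximate normality of $z = \tanh^{-1} r$ can be seen by simulation (a sketch of my own, in Python rather than the book's R; ρ, the sample size and the number of replications are arbitrary). Repeatedly drawing bivariate normal samples with correlation ρ, the values of $\tanh^{-1} r$ have mean close to $\tanh^{-1}\rho$ and variance close to 1/n:

```python
import math
import random

random.seed(3)
rho, n, reps = 0.5, 400, 1000

def sample_r():
    # One sample correlation from n bivariate normal pairs with correlation rho
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [rho * x + math.sqrt(1 - rho ** 2) * random.gauss(0, 1) for x in xs]
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

zs = [math.atanh(sample_r()) for _ in range(reps)]
mean_z = sum(zs) / reps
var_z = sum((z - mean_z) ** 2 for z in zs) / reps
print(mean_z, math.atanh(rho), var_z, 1 / n)
```

The simulated mean and variance track $\zeta = \tanh^{-1}\rho$ and 1/n closely even for moderate n.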

  Appendix B: Tables

  Table B.1 Percentage points of the Behrens–Fisher distribution.

  Table B.2 Highest density regions for the chi-squared distribution.

  Table B.3 HDRs for the inverse chi-squared distribution.

  Table B.4 Chi-squared corresponding to HDRs for log chi-squared.

  Table B.5 Values of F corresponding to HDRs for log F.

  Appendix C: R programs

  Appendix D: Further reading

  D.1 Robustness

  Although the importance of robustness (or sensitivity analysis) was mentioned at the end of Section 2.3 on several normal means with a normal prior, not much attention has been devoted to this topic in the rest of the book. Some useful references are Berger (1985, Section 4.7), Box and Tiao (1992, Section 3.2 and passim.), Hartigan (1983, Chapter ), Kadane (1984) and O’Hagan and Forster (2004, Chapter ).

  D.2 Nonparametric methods

  Throughout this book, it is assumed that the data we are analyzing come from some parametric family, so that the density p(x|θ) of any observation x depends on one or more parameters θ (e.g. x is normal of mean θ and known variance). In classical statistics, much attention has been devoted to developing methods which do not make any such assumption, so that you can, for example, say something about the median of a set of observations without assuming that they come from a normal distribution. Some attempts have been made to develop a Bayesian form of nonparametric theory, though this is not easy as it involves setting up a prior distribution over a very large class of densities for the observations. Useful references are Ferguson (1973), Florens et al. (1983), Dalal (1980), Hill (1988), Lenk (1991), Ghosh and Ramamoorthi (2003) and Hjort et al. (2010). A brief account is given by Müller and Quintana (2004).

  D.3 Multivariate estimation

  In order to provide a reasonably simple introduction to Bayesian statistics, avoiding matrix theory as far as possible, the coverage of this book has been restricted largely to cases where only one measurement is taken at a time. Useful references for multivariate Bayesian statistics are Box and Tiao (1992, Chapter ), Zellner (1971, Chapter ), Press (2009) and O’Hagan and Forster (2004, Sections 10.28–10.41).

  D.4 Time series and forecasting

  Methods of dealing with time series, that is, random functions of time, constitute an important area of statistics. Important books on this area from the Bayesian standpoint are West and Harrison (1989) and Pole et al. (1994). Information about software updates can be found at

  A briefer discussion of some of the ideas can be found in Leonard and Hsu (2001, Section 5.3).

  D.5 Sequential methods

  Some idea as to how to apply Bayesian methods in cases where observations are collected sequentially through time can be got from Berger (1985, Chapter ) or O’Hagan and Forster (2004, Sections 3.55–3.57).

  D.6 Numerical methods

  This is the area in which most progress in Bayesian statistics has been made in recent years. Although Chapter is devoted to numerical methods, a mere sketch of the basic ideas has been given. Very useful texts with a wealth of examples and full programs in WinBUGS available on an associated website are Congdon (2002, 2005, 2006 and 2010). Those seriously interested in the application of Bayesian methods in the real world should consult Tanner (1993), Gelman et al. (2004), Carlin and Louis (2008), Gilks et al. (1996), French and Smith (1997) and Brooks (1998).

  More recently, books giving a useful and comprehensible treatment of Bayesian numerical methods using R and WinBUGS include Albert (2009) (a particularly attractive treatment), Marin and Robert (2007), Robert and Casella (2010) and Ntzoufras (2009).

  At a much lower level, Albert (1994) is a useful treatment of elementary Bayesian ideas using Minitab.

  D.7 Bayesian networks

  References on this topic (on which I am not an expert) include Jensen (1996), Jensen and Nielson (2010) and Neapolitan (2004).

  D.8 General reading

  Apart from Jeffreys (1939, 1948 and 1961), Berger (1985) and Box and Tiao (1992), which have frequently been referred to, some useful references are Lindley (1971a), DeGroot (1970) and Raiffa and Schlaifer (1961). The more recent texts by Bernardo and Smith (1994) and O’Hagan and Forster (2004) are very important and give a good coverage of the Bayesian theory. Some useful coverage of Bayesian methods for the linear model can be found in Broemling (1985). Linear Bayes methods are covered in Goldstein and Wooff (2007). Anyone interested in Bayesian statistics will gain a great deal by reading de Finetti (1972 and 1974–1975) and Savage (1972 and 1981). A useful collection of essays on the foundations of Bayesian statistics is Kyburg and Smokler (1964 and 1980), and a collection of recent influential papers can be found in Polson and Tiao (1995). The Valencia symposia edited by Bernardo et al. (1980–2011) and the ‘case studies’ edited by Gatsonis et al. (1993–2002) contain a wealth of material. A comparison of Bayesian and other approaches to statistical inference is provided by Barnett (1982). Nice recent textbook treatments at a lower level than this book can be found in Berry (1996) and Bolstad (2007).

  A very nice book giving a treatment of Bayesian methods of great interest both to the layman and to the specialist is McGrayne (2011).

  References

  Abramowitz, M., and Stegun, I. A., Handbook of Mathematical Functions, Washington, DC: National Bureau of Standards (1964); New York: Dover (1965).

  Aitken, C. G. G. Lies, damned lies and expert witnesses, Mathematics Today: Bull. Inst. Math. Appl., 32 (5/6) (1996), 76–80.

  Aitken, C. G. G., and Taroni, F., Statistics and the Evaluation of Evidence for Forensic Scientists, New York: John Wiley & Sons (2004) [1st edn by Aitken alone (1995)].

  Albert, J. H., Bayesian Computation Using Minitab, Belmont, CA: Duxbury (1994).

  Albert, J. H., Bayesian Computation with R (2nd edn), New York: Springer-Verlag 2009 [1st edn 2007].

  Altham, P. M. E., Exact Bayesian analysis of a 2 × 2 contingency table and Fisher’s ‘exact’ significance test, J. Roy. Statist. Soc. Ser. B, 31 (1969), 261–269.

  Arbuthnot, J., An argument for Divine Providence taken from the constant Regularity of the Births of Both Sexes, Phil. Trans. Roy. Soc. London, 23 (1710), 186–190 [reprinted in Kendall and Plackett (1977)].

  Armitage, P., Berry, G., and Matthews, J. N. S., Statistical Methods in Medical Research (4th edn), Oxford: Blackwells (2001) [1st edn, by Armitage alone (1971); 2nd edn (1987) and 3rd edn (1994) by Armitage and Berry (1987)].

  Arnold, B. C., Pareto Distributions, Fairland, MD: International Co-operative Publishing House (1983).

  Aykaç, A., and Brumat, C. (eds), New Methods in the Applications of Bayesian Methods, Amsterdam: North-Holland (1977).

  Balding, D. J., and Donnelly, P., Inference in forensic identification, J. Roy. Statist. Soc. Ser. A, 158 (1995), 21–53.

  Baird, R. D., Experimentation: An Introduction to Measurement Theory and Experiment Design, Englewood Cliffs, NJ: Prentice-Hall (1962).

  Balakrishnan, N., Kotz, S., and Johnson, N. L., Continuous Multivariate Distributions: Models and Applications, New York: John Wiley & Sons (2012) [previous edition by Johnson and Kotz alone (1972); this book overlaps with Fang, Kotz and Wang (1989)].

  Barnard, G. A., Thomas Bayes's essay towards solving a problem in the doctrine of chances, Biometrika, 45 (1958), 293–315 [reprinted in Pearson and Kendall (1970)].

  Barnett, V., Comparative Statistical Inference (3rd edn), New York: John Wiley & Sons (1999) [1st edn (1973), 2nd edn (1982)].

  Barnett, V. D., Evaluation of the maximum-likelihood estimator where the likelihood equation has multiple roots, Biometrika, 53 (1966), 151–165.

  Bartlett, M. S., The information available in small samples, Proc. Cambridge Philos. Soc., 32 (1936), 560–566.

  Bartlett, M. S., A comment on D. V. Lindley's statistical paradox, Biometrika, 44 (1957), 533–534.

  Batschelet, E., Circular Statistics in Biology, London: Academic Press (1981).

  Bayes, T. R., An essay towards solving a problem in the doctrine of chances, Phil. Trans. Roy. Soc. London, 53 (1763), 370–418 [reprinted as part of Barnard (1958) and Pearson and Kendall (1970)]; see also Price (1764).

  Beaumont, M. A., Zhang, W., and Balding, D. J., Approximate Bayesian computation in population genetics, Genetics, 162 (2002), 2025–2035.

  Behrens, W. A., Ein Beitrag zur Fehlerberechnung bei wenigen Beobachtungen, Landwirtschaftliche Jahrbücher, 68 (1929), 807–837.

  Bellhouse, D. R., The Reverend Thomas Bayes, FRS: A biography to celebrate the tercentenary of his birth (with discussion), Statistical Science, 19 (2004), 3–43.

  Bellhouse, D. R. et al., Notes about the Rev. Thomas Bayes, Bull. Inst. Math. Stat., 17 (1988), 49, 276–278, 482–483; 19 (1990), 478–479; 20 (1991), 226; 21 (1992), 225–227.

  Benford, F., The law of anomalous numbers, Proc. Amer. Philos. Soc., 78 (1938), 551–572.

  Berger, J. O., A robust generalized Bayes estimator and confidence region for a multivariate normal mean, Ann. Statist., 8 (1980), 716–761.

  Berger, J. O., Statistical Decision Theory and Bayesian Analysis (2nd edn), Berlin: Springer-Verlag (1985) [1st edn published as Statistical Decision Theory: Foundations, Concepts and Methods, Berlin: Springer-Verlag (1980)].

  Berger, J. O., and Delampady, M., Testing precise hypotheses (with discussion), Statistical Science, 2 (1987), 317–352.

  Berger, J. O., and Wolpert, R. L., The Likelihood Principle (2nd edn), Hayward, CA: Institute of Mathematical Statistics (1988) [1st edn (1984)].

  Bernardo, J. M., Reference posterior distributions for Bayesian inference (with discussion), J. Roy. Statist. Soc. Ser. B, 41 (1979), 113–147.

  Bernardo, J. M., Bayarri, M. J., Berger, J. O., Dawid, A. P., Heckermann, D., Smith, A. F. M., and West, M. (eds), Bayesian Statistics 8, Oxford: Oxford University Press (2007).

  Bernardo, J. M., Bayarri, M. J., Berger, J. O., Dawid, A. P., Heckermann, D., Smith, A. F. M., and West, M. (eds), Bayesian Statistics 9, Oxford: Oxford University Press (2011).

  Bernardo, J. M., Berger, J. M., Dawid, A. P., Smith, A. F. M., and DeGroot, M. H. (eds), Bayesian Statistics 4, Oxford: Oxford University Press (1992).

  Bernardo, J. M., Berger, J. M., Dawid, A. P., and Smith, A. F. M. (eds), Bayesian Statistics 5, Oxford: Oxford University Press (1996).

  Bernardo, J. M., Berger, J. O., Dawid, A. P., and Smith, A. F. M. (eds), Bayesian Statistics 6, Oxford: Oxford University Press (1999).

  Bernardo, J. M., Dawid, A. P., Berger, J. O., West, M., Heckermann, D., and Bayarri, M. J. (eds), Bayesian Statistics 7, Oxford: Oxford University Press (2003).

  Bernardo, J. M., DeGroot, M. H., Lindley, D. V., and Smith, A. F. M. (eds), Bayesian Statistics, Valencia: Valencia University Press (1980).

  Bernardo, J. M., DeGroot, M. H., Lindley, D. V., and Smith, A. F. M. (eds), Bayesian Statistics 2, Amsterdam: North-Holland and Valencia: Valencia University Press (1985).

  Bernardo, J. M., DeGroot, M. H., Lindley, D. V., and Smith, A. F. M. (eds), Bayesian Statistics 3, Oxford: Oxford University Press (1988).

  Bernardo, J. M., and Smith, A. F. M., Bayesian Theory, New York, NY: John Wiley & Sons (1994).

  Berry, D. A., Statistics: A Bayesian Perspective, Belmont, CA: Duxbury (1996).

  Besag, J., On the statistical analysis of dirty pictures (with discussion), J. Roy. Statist. Soc. Ser. B, 48 (1986), 259–302.

  Birnbaum, A., On the foundations of statistical inference (with discussion), J. Amer. Statist. Assoc., 57 (1962), 269–306.

  Bliss, C. I., The dosage of the dosage-mortality curve, Annals of Applied Biology, 22 (1935), 134–167.

  Blum, M. G. B., and François, O., Non-linear regression models for Approximate Bayesian Computation, Statistics and Computing 20 (2010), 63–73.

  Bolstad, W. M., Introduction to Bayesian Statistics (2nd edn), Chichester: John Wiley & Sons (2007) [1st edn (2004)].

  Bortkiewicz, L. von, Das Gesetz der kleinen Zahlen, Leipzig: Teubner (1898).

  Box, G. E. P., Hunter, W. G., and Hunter, J. S., Statistics for Experimenters, New York: John Wiley & Sons (1978).

  Box, G. E. P., and Tiao, G. C., Bayesian Inference in Statistical Analysis, New York: John Wiley & Sons (1992) [1st edn (1973)].

  Breiman, L., Probability, Reading, MA: Addison-Wesley (1968).

  British Association for the Advancement of Science, Mathematical Tables, Vol. VI: Bessel Functions, Part I, Functions of Order
Zero and Unity, Cambridge: Cambridge University Press (1937).

  Broemling, L. D., Bayesian Analysis of Linear Models, Basel: Marcel Dekker (1985).

  Brooks, S. P., Markov chain Monte Carlo method and its application, The Statistician: J. Roy. Statist. Soc. Ser. D, 47 (1998), 69–100.

  Brooks, S., Gelman, A., Jones, G. L., and Meng, X. L., Handbook of Markov Chain Monte Carlo, Boca Raton, FL: CRC Press (2011).

  Buck, C. E., Cavanagh, W. G., and Litton, C. D., Bayesian Approach to Interpreting Archaeological Data, New York: John Wiley & Sons (1996).

  Calvin, T. W., How and When to Perform Bayesian Acceptance Sampling (ASQC Basic References in Quality Control: Statistical Techniques, Volume 7), Milwaukee, WI: American Society for Quality Control (1984).

  Carlin, B. P., Gelfand, A. E., and Smith, A. F. M., Hierarchical Bayesian analysis of changepoint problems, Applied Statistics, 41 (1992), 389–405.

  Carlin, B. P., and Louis, T. A., Bayes and Empirical Bayes Methods for Data Analysis (3rd edn), London: Chapman and Hall (2008) [1st edn (1994), 2nd edn (2000)].

  Casella, G., and George, E., Explaining the Gibbs sampler, American Statistician, 46 (1992), 167–174.

 
