
Anders Hald A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713 to 1935 PDF

by Anders Hald
211 pages · 2005 · 1.65 MB · English

Preview: Anders Hald, A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713 to 1935

Anders Hald
A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713 to 1935
Department of Applied Mathematics and Statistics, University of Copenhagen
Universitetsparken 5, DK-2100 Copenhagen Ø
© Anders Hald 2004
ISBN 87-7834-628-2

Contents

Preface

Chapter 1. The three revolutions in parametric statistical inference
1.1. Introduction
1.2. Laplace on direct probability, 1776-1799
1.3. The first revolution: Laplace 1774-1786
1.4. The second revolution: Gauss and Laplace 1809-1828
1.5. The third revolution: R. A. Fisher 1912-1956

Part 1. BINOMIAL STATISTICAL INFERENCE. The three pioneers: Bernoulli (1713), de Moivre (1733) and Bayes (1764)

Chapter 2. James Bernoulli's law of large numbers for the binomial, 1713, and its generalization
2.1. Bernoulli's law of large numbers for the binomial, 1713
2.2. Remarks on further developments

Chapter 3. De Moivre's normal approximation to the binomial, 1733, and its generalization
3.1. De Moivre's normal approximation to the binomial, 1733
3.2. Lagrange's multivariate normal approximation to the multinomial and his confidence interval for the binomial parameter, 1776
3.3. De Morgan's continuity correction, 1838

Chapter 4. Bayes's posterior distribution of the binomial parameter and his rule for inductive inference, 1764
4.1. The posterior distribution of the binomial parameter, 1764
4.2. Bayes's rule for inductive inference, 1764

Part 2. STATISTICAL INFERENCE BY INVERSE PROBABILITY. Inverse probability from Laplace (1774) and Gauss (1809) to Edgeworth (1909)

Chapter 5. Laplace's theory of inverse probability, 1774-1786
5.1. Biography of Laplace
5.2. The principle of inverse probability and the symmetry of direct and inverse probability, 1774
5.3. Posterior consistency and asymptotic normality in the binomial case, 1774
5.4. The predictive distribution, 1774-1786
5.5. A statistical model and a method of estimation. The double exponential distribution, 1774
5.6. The asymptotic normality of posterior distributions, 1785

Chapter 6. A nonprobabilistic interlude: The fitting of equations to data, 1750-1805
6.1. The measurement error model
6.2. The method of averages by Mayer, 1750, and Laplace, 1788
6.3. The method of least absolute deviations by Boscovich, 1757, and Laplace, 1799
6.4. The method of least squares by Legendre, 1805

Chapter 7. Gauss's derivation of the normal distribution and the method of least squares, 1809
7.1. Biography of Gauss
7.2. Gauss's derivation of the normal distribution, 1809
7.3. Gauss's first proof of the method of least squares, 1809
7.4. Laplace's large-sample justification of the method of least squares, 1810

Chapter 8. Credibility and confidence intervals by Laplace and Gauss
8.1. Large-sample credibility and confidence intervals for the binomial parameter by Laplace, 1785 and 1812
8.2. Laplace's general method for constructing large-sample credibility and confidence intervals, 1785 and 1812
8.3. Credibility intervals for the parameters of the linear normal model by Gauss, 1809 and 1816
8.4. Gauss's rule for transformation of estimates and its implication for the principle of inverse probability, 1816
8.5. Gauss's shortest confidence interval for the standard deviation of the normal distribution, 1816

Chapter 9. The multivariate posterior distribution
9.1. Bienaymé's distribution of a linear combination of the variables, 1838
9.2. Pearson and Filon's derivation of the multivariate posterior distribution, 1898

Chapter 10. Edgeworth's genuine inverse method and the equivalence of inverse and direct probability in large samples, 1908 and 1909
10.1. Biography of Edgeworth
10.2. The derivation of the t distribution by Lüroth, 1876, and Edgeworth, 1883
10.3. Edgeworth's genuine inverse method, 1908 and 1909

Chapter 11. Criticisms of inverse probability
11.1. Laplace
11.2. Poisson
11.3. Cournot
11.4. Ellis, Boole and Venn
11.5. Bing and von Kries
11.6. Edgeworth and Fisher

Part 3. THE CENTRAL LIMIT THEOREM AND LINEAR MINIMUM VARIANCE ESTIMATION BY LAPLACE AND GAUSS

Chapter 12. Laplace's central limit theorem and linear minimum variance estimation
12.1. The central limit theorem, 1810 and 1812
12.2. Linear minimum variance estimation, 1811 and 1812
12.3. Asymptotic relative efficiency of estimates, 1818
12.4. Generalizations of the central limit theorem

Chapter 13. Gauss's theory of linear minimum variance estimation
13.1. The general theory, 1823
13.2. Estimation under linear constraints, 1828
13.3. A review of justifications for the method of least squares
13.4. The state of estimation theory about 1830

Part 4. ERROR THEORY. SKEW DISTRIBUTIONS. CORRELATION. SAMPLING DISTRIBUTIONS

Chapter 14. The development of a frequentist error theory
14.1. The transition from inverse to frequentist error theory
14.2. Hagen's hypothesis of elementary errors and his maximum likelihood argument, 1837
14.3. Frequentist error theory by Chauvenet, 1863, and Merriman, 1884

Chapter 15. Skew distributions and the method of moments
15.1. The need for skew distributions
15.2. Series expansions of frequency functions. The A and B series
15.3. Biography of Karl Pearson
15.4. Pearson's four-parameter system of continuous distributions, 1895
15.5. Pearson's χ² test for goodness of fit, 1900
15.6. The asymptotic distribution of the moments by Sheppard, 1899
15.7. Kapteyn's derivation of skew distributions, 1903

Chapter 16. Normal correlation and regression
16.1. Some early cases of normal correlation and regression
16.2. Galton's empirical investigations of regression and correlation, 1869-1890
16.3. The mathematization of Galton's ideas by Edgeworth, Pearson and Yule
16.4. Orthogonal regression. The orthogonalization of the linear model

Chapter 17. Sampling distributions under normality, 1876-1908
17.1. The distribution of the arithmetic mean
17.2. The distribution of the variance and the mean deviation by Helmert, 1876
17.3. Pizzetti's orthonormal decomposition of the sum of squared errors in the linear-normal model, 1892
17.4. Student's t distribution by Gosset, 1908

Part 5. THE FISHERIAN REVOLUTION, 1912-1935

Chapter 18. Fisher's early papers, 1912-1921
18.1. Biography of Fisher
18.2. Fisher's "absolute criterion", 1912
18.3. The distribution of the correlation coefficient, 1915, its transform, 1921, with remarks on later results on partial and multiple correlation
18.4. The sufficiency of the sample variance, 1920

Chapter 19. The revolutionary paper, 1922
19.1. The parametric model and criteria of estimation, 1922
19.2. Properties of the maximum likelihood estimate
19.3. The two-stage maximum likelihood method and unbiasedness

Chapter 20. Studentization, the F distribution and the analysis of variance, 1922-1925
20.1. Studentization and applications of the t distribution
20.2. The F distribution
20.3. The analysis of variance

Chapter 21. The likelihood function, ancillarity and conditional inference
21.1. The amount of information, 1925
21.2. Ancillarity and conditional inference
21.3. The exponential family of distributions, 1934
21.4. The likelihood function

Epilogue
Terminology and notation
Books on the history of statistics
Books on the history of statistical ideas
References
Subject index
Author index

Preface

This is an attempt to write a history of parametric statistical inference. It may be used as the basis for a course in this important topic. It should be easy to read for anybody having taken an elementary course in probability and statistics.

The reader wanting more details, more proofs, more references and more information on related topics may find them in my previous two books: A History of Probability and Statistics and Their Applications before 1750, Wiley, 1990, and A History of Mathematical Statistics from 1750 to 1930, Wiley, 1998.

The text contains a republication of pages 488-489, 494-496, 612-618, 620-626, 633-636, 652-655, 670-685, 713-720, and 734-738 from A. Hald: A History of Mathematical Statistics from 1750 to 1930, Copyright © 1998 by John Wiley & Sons, Inc. This material is used by permission of John Wiley & Sons, Inc.

I thank my granddaughter Nina Hald for typing the first version of the manuscript.

September 2003
Anders Hald

I thank Professor Søren Johansen, University of Copenhagen, for a thorough discussion of the manuscript with me. I thank Professor Michael Sørensen, Department of Applied Mathematics and Statistics, University of Copenhagen, for including my book in the Department's series of publications.

December 2004
Anders Hald

Portraits: James Bernoulli (1654-1705), Abraham de Moivre (1667-1754), Pierre Simon Laplace (1749-1827), Carl Friedrich Gauss (1777-1855), Ronald Aylmer Fisher (1890-1962)

Description:
More details, proofs and references on related topics may be found in the author's two previous books: A History of Probability and Statistics and Their Applications before 1750, Wiley, 1990, and A History of Mathematical Statistics from 1750 to 1930, Wiley, 1998.