Lavoisier S.A.S.
14 rue de Provigny
94236 Cachan cedex
FRANCE

Opening hours: 08:30–12:30 / 13:30–17:30
Tel.: +33 (0)1 47 40 67 00
Fax: +33 (0)1 47 40 67 02


Canonical URL: www.lavoisier.fr/livre/mathematiques/computer-age-statistical-inference-student-edition/descriptif_4497465
Short URL (permalink): www.lavoisier.fr/livre/notice.asp?ouvrage=4497465

Computer Age Statistical Inference, Student Edition: Algorithms, Evidence, and Data Science (Institute of Mathematical Statistics Monographs Series)

Language: English

Authors: Bradley Efron, Trevor Hastie

The twenty-first century has seen a breathtaking expansion of statistical methodology, both in scope and influence. 'Data science' and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? How does it all fit together? Now in paperback and fortified with exercises, this book delivers a concentrated course in modern statistical thinking. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov Chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. Each chapter ends with class-tested exercises, and the book concludes with speculation on the future direction of statistics and data science.
Part I. Classic Statistical Inference:
1. Algorithms and inference
2. Frequentist inference
3. Bayesian inference
4. Fisherian inference and maximum likelihood estimation
5. Parametric models and exponential families

Part II. Early Computer-Age Methods:
6. Empirical Bayes
7. James–Stein estimation and ridge regression
8. Generalized linear models and regression trees
9. Survival analysis and the EM algorithm
10. The jackknife and the bootstrap
11. Bootstrap confidence intervals
12. Cross-validation and Cp estimates of prediction error
13. Objective Bayes inference and Markov chain Monte Carlo
14. Statistical inference and methodology in the postwar era

Part III. Twenty-First-Century Topics:
15. Large-scale hypothesis testing and false-discovery rates
16. Sparse modeling and the lasso
17. Random forests and boosting
18. Neural networks and deep learning
19. Support-vector machines and kernel methods
20. Inference after model selection
21. Empirical Bayes estimation strategies

Epilogue; References; Author Index; Subject Index.
Bradley Efron is Max H. Stein Professor, Professor of Statistics, and Professor of Biomedical Data Science at Stanford University. He has held visiting faculty appointments at Harvard, UC Berkeley, and Imperial College London. Efron has worked extensively on theories of statistical inference, and is the inventor of the bootstrap sampling technique. He received the National Medal of Science in 2005, the Guy Medal in Gold of the Royal Statistical Society in 2014, and the International Prize in Statistics in 2019.
Trevor Hastie is John A. Overdeck Professor, Professor of Statistics, and Professor of Biomedical Data Science at Stanford University. He is coauthor of The Elements of Statistical Learning (2009), a key text in the field of modern data analysis. He is also known for his work on generalized additive models and for his contributions to the R computing environment. Hastie was elected to the National Academy of Sciences in 2018, received the Sigillum Magnum from the University of Bologna in 2019, and received the Leo Breiman Award from the American Statistical Association in 2020.
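To illustrate the bootstrap resampling idea that runs through the book (Efron's invention, and the subject of chapters 10 and 11), here is a minimal sketch in plain Python; the data values and function names are illustrative only, not taken from the book:

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=0):
    """Estimate the standard error of `stat` by bootstrap resampling."""
    rng = random.Random(seed)
    n = len(data)
    replicates = []
    for _ in range(n_boot):
        # Draw a resample of the same size, with replacement
        resample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(stat(resample))
    # The spread of the bootstrap replicates estimates the standard error
    return statistics.stdev(replicates)

# Hypothetical sample: estimate the standard error of its mean
sample = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8, 4.9, 3.7, 2.5]
print(bootstrap_se(sample))
```

The appeal of the method, as the blurb's phrase "methodology and algorithms" suggests, is that the same recipe works for statistics whose sampling distribution has no simple formula: pass `statistics.median` (or any other function of the sample) as `stat`.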

Publication date:

506 pages.

15.2 × 22.8 cm

Available from the publisher (supply lead time: 14 days).

39,35 €

