Bayesian Data Analysis: Publication Information
- ISBN: 9787519261818
- Barcode: 9787519261818; 978-7-5192-6181-8
- Binding: paperback, standard offset paper
- Number of volumes: not listed
- Weight: not listed
Bayesian Data Analysis: Description
This book won the 2016 De Groot Prize, the biennial award for a statistics textbook. Now in its third edition, it has become a classic: a widely recognized, leading text on Bayesian methods, praised for its accessibility and its practical, hands-on approach to analyzing data and solving research problems. Bayesian Data Analysis, Third Edition continues to take an applied approach while incorporating the latest Bayesian methods. The authors, all leading figures in statistics, introduce basic concepts from a data-analytic point of view before presenting more advanced methods. Throughout the book, numerous worked examples drawn from real applications and research emphasize the practice of Bayesian inference.
Bayesian Data Analysis: Table of Contents
Preface
Part I: Fundamentals of Bayesian Inference
1 Probability and inference
1.1 The three steps of Bayesian data analysis
1.2 General notation for statistical inference
1.3 Bayesian inference
1.4 Discrete examples: genetics and spell checking
1.5 Probability as a measure of uncertainty
1.6 Example: probabilities from football point spreads
1.7 Example: calibration for record linkage
1.8 Some useful results from probability theory
1.9 Computation and software
1.10 Bayesian inference in applied statistics
1.11 Bibliographic note
1.12 Exercises
2 Single-parameter models
2.1 Estimating a probability from binomial data
2.2 Posterior as compromise between data and prior information
2.3 Summarizing posterior inference
2.4 Informative prior distributions
2.5 Normal distribution with known variance
2.6 Other standard single-parameter models
2.7 Example: informative prior distribution for cancer rates
2.8 Noninformative prior distributions
2.9 Weakly informative prior distributions
2.10 Bibliographic note
2.11 Exercises
3 Introduction to multiparameter models
3.1 Averaging over 'nuisance parameters'
3.2 Normal data with a noninformative prior distribution
3.3 Normal data with a conjugate prior distribution
3.4 Multinomial model for categorical data
3.5 Multivariate normal model with known variance
3.6 Multivariate normal with unknown mean and variance
3.7 Example: analysis of a bioassay experiment
3.8 Summary of elementary modeling and computation
3.9 Bibliographic note
3.10 Exercises
4 Asymptotics and connections to non-Bayesian approaches
4.1 Normal approximations to the posterior distribution
4.2 Large-sample theory
4.3 Counterexamples to the theorems
4.4 Frequency evaluations of Bayesian inferences
4.5 Bayesian interpretations of other statistical methods
4.6 Bibliographic note
4.7 Exercises
5 Hierarchical models
5.1 Constructing a parameterized prior distribution
5.2 Exchangeability and hierarchical models
5.3 Bayesian analysis of conjugate hierarchical models
5.4 Normal model with exchangeable parameters
5.5 Example: parallel experiments in eight schools
5.6 Hierarchical modeling applied to a meta-analysis
5.7 Weakly informative priors for variance parameters
5.8 Bibliographic note
5.9 Exercises
Part II: Fundamentals of Bayesian Data Analysis
6 Model checking
6.1 The place of model checking in applied Bayesian statistics
6.2 Do the inferences from the model make sense?
6.3 Posterior predictive checking
6.4 Graphical posterior predictive checks
6.5 Model checking for the educational testing example
6.6 Bibliographic note
6.7 Exercises
7 Evaluating, comparing, and expanding models
7.1 Measures of predictive accuracy
7.2 Information criteria and cross-validation
7.3 Model comparison based on predictive performance
7.4 Model comparison using Bayes factors
7.5 Continuous model expansion
7.6 Implicit assumptions and model expansion: an example
7.7 Bibliographic note
7.8 Exercises
8 Modeling accounting for data collection
8.1 Bayesian inference requires a model for data collection
8.2 Data-collection models and ignorability
8.3 Sample surveys
8.4 Designed experiments
8.5 Sensitivity and the role of randomization
8.6 Observational studies
8.7 Censoring and truncation
8.8 Discussion
8.9 Bibliographic note
8.10 Exercises
9 Decision analysis
9.1 Bayesian decision theory in different contexts
9.2 Using regression predictions: survey incentives
9.3 Multistage decision making: medical screening
9.4 Hierarchical decision analysis for home radon
9.5 Personal vs. institutional decision analysis
9.6 Bibliographic note
9.7 Exercises
Part III: Advanced Computation
10 Introduction to Bayesian computation
10.1 Numerical integration
10.2 Distributional approximations
10.3 Direct simulation and rejection sampling
10.4 Importance sampling
10.5 How many simulation draws are needed?
10.6 Computing environments
10.7 Debugging Bayesian computing
10.8 Bibliographic note
10.9 Exercises
11 Basics of Markov chain simulation
11.1 Gibbs sampler
11.2 Metropolis and Metropolis-Hastings algorithms
11.3 Using Gibbs and Metropolis as building blocks
11.4 Inference and assessing convergence
11.5 Effective number of simulation draws
11.6 Example: hierarchical normal model
11.7 Bibliographic note
11.8 Exercises
12 Computationally efficient Markov chain simulation
12.1 Efficient Gibbs samplers
12.2 Efficient Metropolis jumping rules
12.3 Further extensions to Gibbs and Metropolis
12.4 Hamiltonian Monte Carlo
12.5 Hamiltonian Monte Carlo for a hierarchical model
12.6 Stan: developing a computing environment
12.7 Bibliographic note
12.8 Exercises
13 Modal and distributional approximations
13.1 Finding posterior modes
13.2 Boundary-avoiding priors for modal summaries
13.3 Normal and related mixture approximations
13.4 Finding marginal posterior modes using EM
13.5 Conditional and marginal posterior approximations
13.6 Example: hierarchical normal model (continued)
13.7 Variational inference
13.8 Expectation propagation
13.9 Other approximations
13.10 Unknown normalizing factors
13.11 Bibliographic note
13.12 Exercises
Part IV: Regression Models
14 Introduction to regression models
14.1 Conditional modeling
14.2 Bayesian analysis of classical regression
14.3 Regression for causal inference: incumbency and voting
14.4 Goals of regression analysis
14.5 Assembling the matrix of explanatory variables
14.6 Regularization and dimension reduction
14.7 Unequal variances and correlations
14.8 Including numerical prior information
14.9 Bibliographic note
14.10 Exercises
15 Hierarchical linear models
15.1 Regression coefficients exchangeable in batches
15.2 Example: forecasting U.S. presidential elections
15.3 Interpreting a normal prior distribution as extra data
15.4 Varying intercepts and slopes
15.5 Computation: batching and transformation
15.6 Analysis of variance and the batching of coefficients
15.7 Hierarchical models for batches of variance components
15.8 Bibliographic note
15.9 Exercises
16 Generalized linear models
16.1 Standard generalized linear model likelihoods
16.2 Working with generalized linear models
16.3 Weakly informative priors for logistic regression
16.4 Overdispersed Poisson regression for police stops
16.5 State-level opinions from national polls
16.6 Models for multivariate and multinomial responses
16.7 Loglinear models for multivariate discrete data
16.8 Bibliographic note
16.9 Exercises
17 Models for robust inference
17.1 Aspects of robustness
17.2 Overdispersed versions of standard models
17.3 Posterior inference and computation
17.4 Robust inference for the eight schools
17.5 Robust regression using t-distributed errors
17.6 Bibliographic note
17.7 Exercises
18 Models for missing data
18.1 Notation
18.2 Multiple imputation
18.3 Missing data in the multivariate normal and t models
18.4 Example: multiple imputation for a series of polls
18.5 Missing values with counted data
18.6 Example: an opinion poll in Slovenia
18.7 Bibliographic note
18.8 Exercises
Part V: Nonlinear and Nonparametric Models
19 Parametric nonlinear models
19.1 Example: serial dilution assay
19.2 Example: population toxicokinetics
19.3 Bibliographic note
19.4 Exercises
20 Basis function models
20.1 Splines and weighted sums of basis functions
20.2 Basis selection and shrinkage of coefficients
20.3 Non-normal models and regression surfaces
20.4 Bibliographic note
20.5 Exercises
21 Gaussian process models
21.1 Gaussian process regression
21.2 Example: birthdays and birthdates
21.3 Latent Gaussian process models
21.4 Functional data analysis
21.5 Density estimation and regression
21.6 Bibliographic note
21.7 Exercises
22 Finite mixture models
22.1 Setting up and interpreting mixture models
22.2 Example: reaction times and schizophrenia
22.3 Label switching and posterior computation
22.4 Unspecified number of mixture components
22.5 Mixture models for classification and regression
22.6 Bibliographic note
22.7 Exercises
23 Dirichlet process models
23.1 Bayesian histograms
23.2 Dirichlet process prior distributions
23.3 Dirichlet process mixtures
23.4 Beyond density estimation
23.5 Hierarchical dependence
23.6 Density regression
23.7 Bibliographic note
23.8 Exercises
Appendixes
A Standard probability distributions
A.1 Continuous distributions
A.2 Discrete distributions
A.3 Bibliographic note
B Outline of proofs of limit theorems
B.1 Bibliographic note
C Computation in R and Stan
C.1 Getting started with R and Stan
C.2 Fitting a hierarchical model in Stan
C.3 Direct simulation, Gibbs, and Metropolis in R
C.4 Programming Hamiltonian Monte Carlo in R
C.5 Further comments on computation
C.6 Bibliographic note
References
Author Index
Subject Index
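To give a taste of the computational material in Part III (the book itself presents its code in R and Stan, per Appendix C), here is a minimal sketch in Python of a random-walk Metropolis sampler in the spirit of Section 11.2, applied to the binomial-probability model of Section 2.1. The function names, tuning values, and structure are illustrative assumptions for this page, not the book's own implementation; with a uniform prior the exact posterior is Beta(y+1, n-y+1), so the simulation can be checked against a known answer.

```python
import math
import random

def log_posterior(theta, y, n):
    """Log of the unnormalized posterior for a binomial probability
    with a uniform Beta(1, 1) prior: p(theta | y) ∝ theta^y (1-theta)^(n-y)."""
    if theta <= 0.0 or theta >= 1.0:
        return float("-inf")  # zero density outside (0, 1)
    return y * math.log(theta) + (n - y) * math.log(1.0 - theta)

def metropolis(y, n, draws=20000, step=0.1, seed=1):
    """Random-walk Metropolis: propose theta' ~ Normal(theta, step),
    accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    theta = 0.5  # starting value
    chain = []
    for _ in range(draws):
        proposal = theta + rng.gauss(0.0, step)
        log_ratio = log_posterior(proposal, y, n) - log_posterior(theta, y, n)
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            theta = proposal  # accept; otherwise keep the current value
        chain.append(theta)
    return chain

# Observe y = 7 successes in n = 10 trials; the exact posterior is
# Beta(8, 4), whose mean is 8/12 ≈ 0.667.
chain = metropolis(y=7, n=10)
post_mean = sum(chain[5000:]) / len(chain[5000:])  # discard warmup draws
print(post_mean)  # typically close to 0.667
```

The warmup discard and the convergence question it sidesteps are exactly the topics of Sections 11.4–11.5; the book's own examples use Stan, which automates these diagnostics.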
Bayesian Data Analysis: About the Authors
Andrew Gelman is a professor in the Department of Statistics at Columbia University and director of its Applied Statistics Center. His honors include the Outstanding Statistical Application Award from the American Statistical Association, the award for best article published in the American Political Science Review, and the Council of Presidents of Statistical Societies award for outstanding contributions by a person under the age of 40. His books include Bayesian Data Analysis (with John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin) and Teaching Statistics, among others.