Slides from my intro to Bayesian regression talk
Back in April, I gave a guest lecture on Bayesian regression for the psychology department's graduate statistics class. This is the same course where I first learned regression, and where I first started using R for statistics instead of just for data cleaning. It was fun drawing on my experience in that course and tailoring the materials to that level of training.
Here are the materials:
[Slides embedded here. First visible slide: "Observations (training data)".]
As I did with my last Bayes talk, I'm going to note some questions from the audience, so I don't forget what kinds of questions people have when they are introduced to Bayesian statistics.
One theme was frequentist baggage. One person asked about Type I and Type II error rates. I did not have a satisfactory (that is, rehearsed) answer ready for this question. I think I said something about how those terms are based on a frequentist, repeated-sampling paradigm, whereas a Bayesian approach worries about different sorts of errors. (Statistical power is still important, of course, for both approaches.) Next time, I should study up on the frequentist properties of Bayesian models, so I can field these questions better.
Other questions:
- Another bit of frequentist baggage. I mentioned that with a posterior predictive distribution, we can put an uncertainty interval on any statistic we can calculate, and this point brought up the question of multiple comparisons. These are a bad thing in classical statistics. But for Bayes, there is only one model, and the "multiple comparisons" are really just different implications of that one model.
- Someone else said that they had heard that Bayesian models can provide evidence for a null effect: how does that work? I briefly described the ROPE (region of practical equivalence) approach, ignoring the existence of Bayes factors entirely.
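To make the posterior-predictive point above concrete, here is a minimal R sketch. The draws are simulated stand-ins (not output from any fitted model); in a real analysis the matrix would come from something like `posterior_predict()` in rstanarm or brms, with one row per posterior draw and one column per observation.

```r
# Simulated stand-in for a posterior predictive matrix:
# rows = posterior draws, columns = predicted observations.
set.seed(20220527)
n_draws <- 4000
n_obs <- 50
ypred <- matrix(
  rnorm(n_draws * n_obs, mean = 100, sd = 15),
  nrow = n_draws, ncol = n_obs
)

# Any statistic computed once per draw gets a posterior
# distribution "for free" -- here, the median of new data.
stat_per_draw <- apply(ypred, 1, median)

# A 95% uncertainty interval for that statistic.
quantile(stat_per_draw, probs = c(0.025, 0.975))
```

The same recipe works for any derived quantity (a range, a difference between group means, a proportion above a cutoff), which is the sense in which the "multiple comparisons" are all implications of one model.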
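And here is the ROPE idea in a few lines of R (my summary of the logic, not code from the talk): take posterior draws of an effect and ask what share of them fall inside a region around zero that the analyst deems practically equivalent to zero. The draws and the ROPE limits below are hypothetical; in practice the draws would come from a fitted model via, e.g., `as.matrix()` on the model object.

```r
# Hypothetical posterior draws for some effect of interest.
set.seed(1)
effect_draws <- rnorm(4000, mean = 0.02, sd = 0.05)

# Region of practical equivalence: effects in (-0.1, 0.1)
# are treated as "practically zero" (an analyst's choice).
rope <- c(-0.1, 0.1)

# Proportion of the posterior that lands inside the ROPE.
inside <- mean(effect_draws > rope[1] & effect_draws < rope[2])
inside
```

If nearly all of the posterior mass sits inside the ROPE, that is the sense in which the model provides evidence for a (practical) null effect, without ever invoking a Bayes factor.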
For future iterations of this tutorial, I should have a worked example, maybe a blog post, on each of these issues.
It's kind of amusing now that I think about it. A big part of my enthusiasm for Bayesian statistics is that I find it much more intuitive than frequentist statistics. Yes! I thought to myself. I never have to worry about what the hell a confidence interval is ever again! Well, actually, no. I need to know this stuff even more thoroughly than ever if I am going to talk fluently about what makes Bayes different. ¯\_(ツ)_/¯
Last knitted on 2022-05-27. Source code on GitHub.

.session_info
#> ─ Session info ───────────────────────────────────────────────────────────────
#>  setting  value
#>  version  R version 4.2.0 (2022-04-22 ucrt)
#>  os       Windows 10 x64 (build 22000)
#>  system   x86_64, mingw32
#>  ui       RTerm
#>  language (EN)
#>  collate  English_United States.utf8
#>  ctype    English_United States.utf8
#>  tz       America/Chicago
#>  date     2022-05-27
#>  pandoc   NA
#>
#> ─ Packages ───────────────────────────────────────────────────────────────────
#>  package     * version    date (UTC) lib source
#>  assertthat    0.2.1      2019-03-21 [1] CRAN (R 4.2.0)
#>  cli           3.3.0      2022-04-25 [1] CRAN (R 4.2.0)
#>  crayon        1.5.1      2022-03-26 [1] CRAN (R 4.2.0)
#>  emo           0.0.0.9000 2022-05-25 [1] Github (hadley/emo@3f03b11)
#>  evaluate      0.15       2022-02-18 [1] CRAN (R 4.2.0)
#>  generics      0.1.2      2022-01-31 [1] CRAN (R 4.2.0)
#>  git2r         0.30.1     2022-03-16 [1] CRAN (R 4.2.0)
#>  glue          1.6.2      2022-02-24 [1] CRAN (R 4.2.0)
#>  here          1.0.1      2020-12-13 [1] CRAN (R 4.2.0)
#>  knitr       * 1.39       2022-04-26 [1] CRAN (R 4.2.0)
#>  lubridate     1.8.0      2021-10-07 [1] CRAN (R 4.2.0)
#>  magrittr      2.0.3      2022-03-30 [1] CRAN (R 4.2.0)
#>  purrr         0.3.4      2020-04-17 [1] CRAN (R 4.2.0)
#>  ragg          1.2.2      2022-02-21 [1] CRAN (R 4.2.0)
#>  rlang         1.0.2      2022-03-04 [1] CRAN (R 4.2.0)
#>  rprojroot     2.0.3      2022-04-02 [1] CRAN (R 4.2.0)
#>  rstudioapi    0.13       2020-11-12 [1] CRAN (R 4.2.0)
#>  sessioninfo   1.2.2      2021-12-06 [1] CRAN (R 4.2.0)
#>  stringi       1.7.6      2021-11-29 [1] CRAN (R 4.2.0)
#>  stringr       1.4.0      2019-02-10 [1] CRAN (R 4.2.0)
#>  systemfonts   1.0.4      2022-02-11 [1] CRAN (R 4.2.0)
#>  textshaping   0.3.6      2021-10-13 [1] CRAN (R 4.2.0)
#>  xfun          0.31       2022-05-10 [1] CRAN (R 4.2.0)
#>
#>  [1] C:/Users/Tristan/AppData/Local/R/win-library/4.2
#>  [2] C:/Program Files/R/R-4.2.0/library
#>
#> ──────────────────────────────────────────────────────────────────────────────