Statistics and Statistical Programming (Winter 2017)/R lecture outline: Week 8


* log() DV, log() IV (sketch below)
* polynomial terms and interaction terms, both with I() (sketch below)
* graphing residuals against fitted values (not just against different values of x) (sketch below)
* discussing anova() better. The key thing I didn't talk about is the different types of "sums of squares," which are discussed in depth on [https://afni.nimh.nih.gov/sscc/gangc/SS.html this page hosted by the NIH]. By default, SPSS produces Type III sums of squares, although these are [https://myowelt.blogspot.com/2008/05/obtaining-same-anova-results-in-r-as-in.html very frequently not what you want].
** Getting the same results in R is easy enough though: library(car); Anova(fit, type="III") (sketch below)
* logistic regression (sketch below)
** create a dummy variable: mpg > mean(mpg)
** glm(formula, family=binomial("logit"))
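
The sketches below all use the built-in mtcars data purely for illustration; the specific variables (mpg, hp, wt, cyl, am) are my own choices, not part of the outline. First, log-transforming the DV and an IV directly inside the model formula:

 ## log-transform the DV and one IV inside the formula
 data(mtcars)
 fit.log <- lm(log(mpg) ~ log(hp) + wt, data=mtcars)
 summary(fit.log)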
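
A sketch of polynomial and product terms wrapped in I(). Note that writing hp * wt in a formula without I() would expand to the main effects plus their interaction; I() forces the literal arithmetic to be treated as a single term.

 ## I() protects arithmetic inside a formula from being read as formula syntax
 fit.poly <- lm(mpg ~ hp + I(hp^2), data=mtcars)           # quadratic term
 fit.prod <- lm(mpg ~ hp + wt + I(hp * wt), data=mtcars)   # literal product term
 summary(fit.poly)
 summary(fit.prod)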
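
Plotting residuals against fitted values, either by hand or with R's built-in diagnostic plot:

 fit <- lm(mpg ~ hp + wt, data=mtcars)
 ## residuals against fitted values, not just against a single x
 plot(fitted(fit), residuals(fit), xlab="fitted values", ylab="residuals")
 abline(h=0, lty=2)
 ## equivalent built-in diagnostic plot
 plot(fit, which=1)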
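
A sketch of the Type III comparison with car::Anova(); the factors used here are again my own choices. Matching SPSS's Type III output also requires sum-to-zero contrasts, which R does not use by default.

 library(car)                                     # provides Anova(); install.packages("car") if needed
 options(contrasts=c("contr.sum", "contr.poly"))  # sum-to-zero contrasts, needed to match SPSS
 fit <- lm(mpg ~ factor(cyl) * factor(am), data=mtcars)
 anova(fit)                 # base R: sequential (Type I) sums of squares
 Anova(fit, type="III")     # Type III sums of squares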
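
Finally, a logistic regression sketch using a dummy DV built from mpg > mean(mpg); the predictors are assumptions for illustration.

 ## dummy DV: TRUE when a car's mpg is above the sample mean
 mtcars$mpg.high <- mtcars$mpg > mean(mtcars$mpg)
 fit.logit <- glm(mpg.high ~ hp + wt, data=mtcars, family=binomial("logit"))
 summary(fit.logit)
 exp(coef(fit.logit))   # exponentiated coefficients are odds ratios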