Statistics and Statistical Programming (Winter 2017)/Problem Set: Week 2
This is general advice going forward, but it makes sense to include it here: start by working through the programming challenges. They will only include material that we covered in the readings for the previous week.
If you're having trouble loading up your dataset (PC2), find me in the next day or so, as you will only be able to do the other challenges once you've done that one.
Programming Challenges
- PC1. Find my GitHub repository for the class assignments and then "Clone or download" it.
- PC2. Once you have it, find the RData file in the week_02 subdirectory with your name associated with it. Load that file into R. It should load up one variable. Find that variable! (Rough code sketches for PC2 through PC7 appear after this list.)
- PC3. Once you've found the variable, compute and present a series of statistics on it that you should already be familiar with. Use functions to compute the mean, median, variance, standard deviation, and interquartile range.
- PC4. Although these basic functions all exist, many things you will want to do in the future won't have built-in functions. Write R code to compute these three statistics by hand: mean, median, and mode. It's OK if getting the answer involves some eyeballing or counting by hand. But do get the answer and be ready to walk us through how you did it.
- PC5. Create a number of visualizations of your dataset: at the very least, create a boxplot and histogram.
- PC6. Some of you will have negative numbers. Whoops! Those were not supposed to be there. Recode all negative numbers as missing (i.e., NA) in your dataset. Now compute a new mean and standard deviation. How do they change? (Hint: the basic mean function will return NA. You have to use the named argument na.rm=TRUE to work around this.)
- PC7. Log transform your dataset. Create new histograms and boxplots, and compute new means, medians, and standard deviations.
- PC8. Commit the code that does all of this into a folder called "week_02" in your git repository. Publish this on GitHub and email me the link to your published GitHub folder.
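For PC2 and PC3, a minimal sketch of the workflow is below. The file name "yourname.RData" and the variable name d are placeholders I made up, not the actual names in the repository; substitute whatever you find in the week_02 subdirectory.

 # PC2: load the RData file and find the object it creates.
 load("week_02/yourname.RData")   # "yourname.RData" is a placeholder
 ls()                             # list workspace objects to spot the new variable
 # PC3: built-in summary statistics, assuming the loaded variable is called d.
 mean(d)
 median(d)
 var(d)
 sd(d)
 IQR(d)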
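For PC4, one possible approach (again assuming the variable is called d) is sketched below; you are welcome to get the median and mode by eyeballing or counting instead.

 # Mean "by hand": the sum divided by the number of observations.
 sum(d) / length(d)
 # Median "by hand": the middle value of the sorted data,
 # or the average of the two middle values when n is even.
 sorted <- sort(d)
 n <- length(sorted)
 if (n %% 2 == 1) {
   sorted[(n + 1) / 2]
 } else {
   (sorted[n / 2] + sorted[n / 2 + 1]) / 2
 }
 # Mode "by hand": count how often each value occurs and pick the most common.
 table(d)
 names(which.max(table(d)))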
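For PC5 and PC6, the base graphics functions and the NA recoding might look something like this (still assuming the variable is called d):

 # PC5: basic visualizations.
 boxplot(d)
 hist(d)
 # PC6: recode negative values as missing (NA).
 d[d < 0] <- NA
 mean(d)                # returns NA once missing values are present
 mean(d, na.rm = TRUE)  # drop the NAs before computing
 sd(d, na.rm = TRUE)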
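For PC7, one way to do the log transform is sketched here. It assumes the negative values were already recoded as NA in PC6; any zeros in your data would also need attention, since log(0) is -Inf.

 # PC7: log transform, then redo the plots and summary statistics.
 log_d <- log(d)
 hist(log_d)
 boxplot(log_d)
 mean(log_d, na.rm = TRUE)
 median(log_d, na.rm = TRUE)
 sd(log_d, na.rm = TRUE)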
Statistical Questions
Exercises from OpenIntro §2
- Q0. Any questions or clarifications from the OpenIntro text or lecture notes?
- Q1. Exercise 2.12 on kids missing school
- Q2. Exercise 2.20 on "assortative mating"
- Q3. Exercise 2.26 on twins (and conditional probability)
- Q4. Exercise 2.32 on the birthday problem (This is a super famous problem! Don't look it up!)
- Q5. Exercise 2.38 with the example of baggage fees
- Q6. Exercise 2.44 on income and gender
Empirical Paper
Let's take a look at this paper, which is the second paper I published in graduate school (!):
- Buechley, Leah and Benjamin Mako Hill. 2010. “LilyPad in the Wild: How Hardware’s Long Tail Is Supporting New Engineering and Design Communities.” Pp. 199–207 in Proceedings of the 8th ACM Conference on Designing Interactive Systems. Aarhus, Denmark: ACM. [PDF available on my personal website]
At the very least, read enough of the paper to get a sense for what it's about and to understand Table 2; the questions here are all basically going to be drawn from that table. It might be expedient to read the blog post that I wrote about this. Feel free to ignore all the stats or other stuff that's not relevant.
Looking at data from the US, answer a few questions we didn't answer in the paper but that seem like they might be interesting. (I had basically only finished taking a class like this one at the time I wrote this paper!)
- Q7. Given that a US customer in the dataset has bought a LilyPad (either alone or in combination with a "normal" Arduino), what is the probability that they are female?
- Q8. Given that a US customer in the dataset is female, what is the probability that they bought a LilyPad (either alone or in combination with a "normal" Arduino)?
- Q9. In substantive terms, are these two numbers evidence that the LilyPad design is successfully appealing to women? Ideally, you should be prepared to present at least one reason why and one reason you might be skeptical.
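As a reminder, Q7 and Q8 are both conditional probabilities that can be computed directly from the counts in Table 2. The notation below is mine, not the paper's; #{...} stands for the number of US customers in the dataset who fall into that category:

\[
P(\text{female} \mid \text{LilyPad}) = \frac{\#\{\text{female and LilyPad}\}}{\#\{\text{LilyPad}\}},
\qquad
P(\text{LilyPad} \mid \text{female}) = \frac{\#\{\text{female and LilyPad}\}}{\#\{\text{female}\}}
\]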