Week 1 HW and Presentation Questions

Week 1 HW:

  1. McElreath’s HW 1
  2. Rethinking 2M3: Suppose there are two globes, one for Earth and one for Mars. The Earth globe is 70% covered in water. The Mars globe is 100% land. Further suppose that one of these globes—you don’t know which—was tossed in the air and produced a “land” observation. Assume that each globe was equally likely to be tossed. Show that the posterior probability that the globe was the Earth, conditional on seeing “land” (Pr(Earth|land)), is 0.23.
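
Since the question already states the answer, here's a quick way to check the arithmetic with Bayes' rule (a sketch of my own working, not the official solution):

```r
# Bayes' rule check for 2M3 (variable names are my own):
p_land_given_earth <- 0.3  # Earth is 70% water, so 30% land
p_land_given_mars  <- 1.0  # Mars is 100% land
p_earth            <- 0.5  # each globe equally likely to be tossed
p_land <- p_land_given_earth * p_earth + p_land_given_mars * (1 - p_earth)
p_land_given_earth * p_earth / p_land  # 0.15 / 0.65 = 0.2308, i.e. ~0.23
```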

Week 2 Presentations:

  1. HW1 ^^^

  2. Chapter 2: 2M4-2M7

  3. Conditional Probability and Monte Carlo
    a. Hiroshi has two children. If you know one of them was a girl, what’s the probability that the other child is a girl? (Counting & Bayes’ Rule)
    b. Prove this with a Monte Carlo experiment (see the simulation sketch after this list).
    c. Tien has two children. You know that one of them is a daughter who was born on a Thursday. Simulate this problem to identify the probability that the other child is also a daughter. Make sure to generate a (sex, day) pair rather than just assuming independence.
    d. Prove the Monty Hall problem via Monte Carlo simulation.
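
A minimal simulation sketch for 3a–3c, under my own reading that "one of them" means "at least one" (the Thursday = day 4 encoding is also my own choice):

```r
set.seed(42)
n <- 1e6

# 3a/3b: two children, condition on at least one being a girl
child1 <- sample(c("G", "B"), n, replace = TRUE)
child2 <- sample(c("G", "B"), n, replace = TRUE)
at_least_one_girl <- child1 == "G" | child2 == "G"
both_girls <- child1 == "G" & child2 == "G"
mean(both_girls[at_least_one_girl])  # ~1/3

# 3c: generate (sex, day) pairs; condition on at least one Thursday daughter
sex <- matrix(sample(c("G", "B"), 2 * n, replace = TRUE), ncol = 2)
day <- matrix(sample(1:7, 2 * n, replace = TRUE), ncol = 2)
thursday_girl <- sex == "G" & day == 4        # encode Thursday as day 4
has_thursday_girl <- thursday_girl[, 1] | thursday_girl[, 2]
both_girls_2 <- sex[, 1] == "G" & sex[, 2] == "G"
mean(both_girls_2[has_thursday_girl])         # ~13/27, i.e. ~0.481, not 1/3
```

The day-of-birth detail changes the answer because it partially identifies which child was being referred to, which is exactly why 3c asks you not to assume independence.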

  4. Simulate a queuing problem, adapted from Gelman. There are 3 doctors at a clinic; new patients arrive according to a Poisson process with an average of 10 minutes between arrivals. Each patient occupies a doctor for 5-20 minutes (uniformly distributed). The clinic opens at 9am and the last patient is admitted at 4pm.
    a. Simulate this process for 100 days and identify average patient wait times (see the single-queue sketch after this list).
    b. Simulate this problem as a multi-queue system, i.e. each patient chooses a doctor to see when they arrive at the clinic.
    c. Simulate a version where the time to solve problems is heavy-tailed (I suggest using a shifted log-normal).
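
To get 4a started, here's a minimal single-queue sketch under my own assumptions: a FIFO queue, patients already waiting at 4pm are still seen, and times are measured in minutes after 9am (so the last admission is at 420 min):

```r
simulate_day <- function(n_doctors = 3, mean_gap = 10, close = 420) {
  arrivals <- cumsum(rexp(1000, rate = 1 / mean_gap))  # Poisson process
  arrivals <- arrivals[arrivals <= close]
  doctor_free <- rep(0, n_doctors)  # time at which each doctor next frees up
  waits <- numeric(length(arrivals))
  for (i in seq_along(arrivals)) {
    d <- which.min(doctor_free)               # next doctor to become free
    start <- max(arrivals[i], doctor_free[d])
    waits[i] <- start - arrivals[i]
    doctor_free[d] <- start + runif(1, min = 5, max = 20)  # consult time
  }
  waits
}

set.seed(1)
all_waits <- unlist(replicate(100, simulate_day(), simplify = FALSE))
mean(all_waits)  # average wait across 100 simulated days
```

For 4b you'd track a separate queue per doctor plus a choice rule at arrival time; for 4c you'd swap `runif(1, 5, 20)` for a shifted log-normal draw, something like `5 + rlnorm(1, meanlog = 2, sdlog = 0.5)` (parameters are just an illustration).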

  5. Monte Carlo, convergence and accuracy
    a. Calculate pi via Buffon’s Needle using Monte Carlo. Use 4 different techniques to model how a human would drop the needles and report on any bias in the estimates.
    b. Calculate pi using the cannonball method. Report the empirical error as a function of the number of samples and explain your hypothesis (see the sketch after this list).
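
For 5b, a minimal cannonball (dartboard) sketch with my own function names: sample points uniformly in the unit square and use the fraction landing inside the quarter circle, which estimates pi/4:

```r
set.seed(7)
estimate_pi <- function(n) {
  x <- runif(n)
  y <- runif(n)
  4 * mean(x^2 + y^2 <= 1)  # fraction inside quarter circle, times 4
}
ns <- 10^(2:6)
data.frame(n = ns, abs_error = sapply(ns, function(n) abs(estimate_pi(n) - pi)))
# the error should shrink roughly like 1/sqrt(n), the usual Monte Carlo rate
```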

  6. (Optional: open-ended) We’ve learned that frequentist methods have an implicit uniform prior. Show that a uniform prior is not always a good prior when doing inference.
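
One possible toy illustration for item 6 (my own example, not the expected answer), reusing the globe-tossing setup: observe 0 waters in 3 tosses and grid-approximate the posterior under a flat prior versus a weakly informative Beta(2, 2) prior:

```r
p_grid <- seq(0, 1, length.out = 1000)
lik <- dbinom(0, size = 3, prob = p_grid)     # likelihood of 0 waters in 3

flat_post <- lik / sum(lik)                   # implicit uniform prior
beta_post <- lik * dbeta(p_grid, 2, 2)        # weakly informative prior
beta_post <- beta_post / sum(beta_post)

p_grid[which.max(flat_post)]  # 0: the flat prior puts the mode on the boundary
p_grid[which.max(beta_post)]  # ~0.2: the informative prior pulls it off p = 0
```

With only 3 tosses, the uniform prior happily concludes the globe might have no water at all, which is a poor inference if we know that's impossible.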

I sent this out via email, but I'm adding it here for posterity.

The 5 presentations have been assigned to the groups, and team leaders will assign 2 members from their team to do the presentation. The presentations should be structured as a full tutorial: you'll effectively be teaching the class and fielding questions from the class (and me). This shouldn't take long, but don't half-ass it either.

  1. Approximately Normal - Oleks Klymenko
  2. Bayesian Anonymous - Arpita Singh
  3. Conditional Experience - Denny Wan
  4. Contagion - Chris Andronikos
  5. Ganymede - Wilfred Gee

Additionally, the HW questions have solutions which you should use when checking your approach. Please share your answers with each other for feedback, as these questions are somewhat open-ended.

Hi Varun,

Group 2:

Eleana and I will be presenting in class tomorrow for our group. Just to confirm: we're covering questions 2M4-2M7 from the book and will be presenting using R?

See you tomorrow!


Yep, pretty much. It’s just conditional probability, so I don’t think you’ll need any R, just a bit of maths!

Hi All,

For groups doing presentations, you can bring your own laptop and plug it in to present. MS has a bajillion connectors, so it's not too tricky. If you want, you can use my laptop too; I'll set up RStudio and Jupyter notebooks for your use.

I'd suggest you post a link to your slides (if you have any) and/or put your code somewhere public, then share your presentation materials in this thread so everyone else has access to them too.

@arpita.singh @wtgee @candronikos

Could I get you to post your presentations + code here? I'll post my own code too (once it's been cleaned up).

Hi All,

See the attached presentation for the Week 1 questions 2M4-2M7.

Cheers,
Arpita


Hi,

Here are my solutions for this week's HW: https://github.com/nayyarv/dsairethinking/tree/master/week1

Feel free to clone the repo since I’ll be pushing all code and collating all the presentations there for the remainder of the course (so a simple git pull should keep you up to date).

Additionally, I’ll be accepting PRs if you want to improve things!


Hi Varun,

Is there a recommended reading section from the textbook anywhere, or would you recommend that we try to follow the chapters, as they roughly follow the video version of the course?

I'd recommend following the chapters as they match the videos. The textbook is effectively the course notes; it's very tightly linked to the lectures rather than being a reference source.

I've put the chapters on the main thread (Welcome to Bayesian Rethinking); this week is Chapter 4.