Fall 2018 S&DS 241: Probability Theory with Applications
Homework 2 Solutions
Due: Sep 9, 2018 in class
Prof. Yihong Wu, Yale University
Solutions prepared by Dylan O'Connell

1. Blitzstein-Hwang, Chapter 2, Problem 11.

An exit poll in an election is a survey taken of voters just after they have voted. One major use of exit polls has been so that news organizations can try to figure out as soon as possible who won the election, before the votes are officially counted. This has been notoriously inaccurate in various elections, sometimes because of selection bias: the sample of people who are invited to and agree to participate in the survey may not be similar enough to the overall population of voters. Consider an election with two candidates, Candidate A and Candidate B. Every voter is invited to participate in an exit poll, where they are asked whom they voted for; some accept and some refuse. For a randomly selected voter, let A be the event that they voted for A, and W be the event that they are willing to participate in the exit poll. Suppose that P(W|A) = 0.7 but P(W|A^c) = 0.3. In the exit poll, 60% of the respondents say they voted for A (assume that they are all honest), suggesting a comfortable victory for A. Find P(A), the true proportion of people who voted for A.

Solution: We have P(A|W) = 0.6, since 60% of the respondents voted for A. Let p = P(A). By Bayes' rule and the law of total probability,

0.6 = P(A|W) = P(W|A)P(A) / [P(W|A)P(A) + P(W|A^c)P(A^c)] = 0.7p / (0.7p + 0.3(1 - p)).

Solving for p: 0.6(0.7p + 0.3(1 - p)) = 0.7p gives 0.18 = 0.46p, so

P(A) = p = 9/23 ≈ 0.391.

So actually A received fewer than half of the votes!
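As a sanity check, the inversion of Bayes' rule above can be carried out in exact arithmetic. This is a minimal sketch; the function name and parameter names are illustrative, not from the problem:

```python
from fractions import Fraction

def true_support(p_w_given_a, p_w_given_not_a, p_a_given_w):
    """Invert P(A|W) = P(W|A)p / (P(W|A)p + P(W|A^c)(1-p)) for p = P(A)."""
    q, r, s = p_a_given_w, p_w_given_a, p_w_given_not_a
    # q(rp + s(1-p)) = rp  =>  qs = p(r - qr + qs)  =>  p = qs / (r - qr + qs)
    return q * s / (r - q * r + q * s)

p = true_support(Fraction(7, 10), Fraction(3, 10), Fraction(6, 10))
print(p)  # 9/23
```

Using exact fractions avoids the rounding that would hide the clean answer 9/23.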


2. Blitzstein-Hwang, Chapter 2, Problem 36.

(a) Suppose that in the population of college applicants, being good at baseball is independent of having a good math score on a certain standardized test (with respect to some measure of "good"). A certain college has a simple admissions procedure: admit an applicant if and only if the applicant is good at baseball or has a good math score on the test. Give an intuitive explanation of why it makes sense that among students that the college admits, having a good math score is negatively associated with being good at baseball, i.e., conditioning on having a good math score decreases the chance of being good at baseball.

(b) Show that if A and B are independent and C = A ∪ B, then A and B are conditionally dependent given C (as long as P(A ∩ B) > 0 and P(A ∪ B) < 1), with P(A|B, C) < P(A|C). This phenomenon is known as Berkson's paradox, especially in the context of admissions to a school, hospital, etc.

Solution: (a) Even though baseball skill and the math score are independent in the general population of applicants, it makes sense that they become dependent (with a negative association) when restricting only to the students who are admitted. This is because within this sub-population, having a bad math score implies being good at baseball (else the student would not have been admitted). So having a good math score decreases the chance of being good at baseball (as shown in Exercise 16, if an event B is evidence in favor of an event A, then B^c is evidence against A). As another explanation, note that 3 types of students are admitted: (i) good math score, good at baseball; (ii) good math score, bad at baseball; (iii) bad math score, good at baseball. Conditioning on having a good math score removes students of type (iii) from consideration, which decreases the proportion of students who are good at baseball.

(b) Assume that A, B, C are as described. Then

P(A|B ∩ C) = P(A|B) = P(A),

since A and B are independent and B ∩ C = B (because B ⊆ C). In contrast,

P(A|C) = P(A ∩ C)/P(C) = P(A)/P(C) > P(A),

since A ⊆ C and 0 < P(C) < 1. Therefore, P(A|B, C) = P(A) < P(A|C).
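The strict inequality in (b) can be checked numerically. The marginals P(A) = 1/2 and P(B) = 1/3 below are hypothetical, chosen only for illustration:

```python
from fractions import Fraction

# Independent events A ("good at baseball") and B ("good math score"),
# with hypothetical marginals chosen for illustration.
pA, pB = Fraction(1, 2), Fraction(1, 3)

p_ab = pA * pB                # P(A ∩ B), by independence
p_c = pA + pB - p_ab          # P(C) = P(A ∪ B), inclusion-exclusion
p_a_given_c = pA / p_c        # P(A|C) = P(A ∩ C)/P(C) = P(A)/P(C), since A ⊆ C
p_a_given_bc = p_ab / pB      # P(A|B ∩ C) = P(A|B) = P(A), since B ⊆ C

print(p_a_given_bc, p_a_given_c)       # 1/2 3/4
assert p_a_given_bc < p_a_given_c      # Berkson's paradox: negative association given C
```

Conditioning on C inflates P(A) to P(A)/P(C), while conditioning additionally on B collapses it back to the unconditional P(A).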

3. Proof:

P(A^c ∩ (B ∪ C^c)) = P((A^c ∩ B) ∪ (A^c ∩ C^c))   (distributing the intersection)
= P(A^c ∩ B) + P(A^c ∩ C^c) − P(A^c ∩ B ∩ C^c)   (inclusion-exclusion)
= P(A^c)P(B) + P(A^c)P(C^c) − P(A^c)P(B)P(C^c)   (mutual independence; if A and B are independent, so are A^c and B, since A^c is determined by A)
= P(A^c)(P(B) + P(C^c) − P(B)P(C^c))
= P(A^c)P(B ∪ C^c)   (inclusion-exclusion again)

Hence A^c and B ∪ C^c are independent.

4. (a) Agree. If P(A) = 1, we have P(A^c) = 0, so P(B ∩ A^c) ≤ P(A^c) = 0, and therefore

P(A ∩ B) = P(B) − P(B ∩ A^c) = P(B) = P(A)P(B).

(b) Disagree. Counterexample: toss a fair coin and let A = Ω, B = {Heads}. Then P(A|B) = 1 but P(B|A) = 1/2.
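The identity proved in Problem 3 above can be spot-checked by enumerating all eight atoms of the sample space under mutual independence. The marginals below are hypothetical, for illustration only:

```python
from fractions import Fraction
from itertools import product

# Hypothetical marginals; A, B, C are taken mutually independent.
pA, pB, pC = Fraction(1, 3), Fraction(1, 2), Fraction(2, 5)

def prob(event):
    """Sum the product-form probabilities of the atoms (a, b, c) in {0,1}^3
    satisfying `event`, where 1 means the event occurred."""
    total = Fraction(0)
    for a, b, c in product((0, 1), repeat=3):
        p = (pA if a else 1 - pA) * (pB if b else 1 - pB) * (pC if c else 1 - pC)
        if event(a, b, c):
            total += p
    return total

lhs = prob(lambda a, b, c: (not a) and (b or not c))                # P(A^c ∩ (B ∪ C^c))
rhs = prob(lambda a, b, c: not a) * prob(lambda a, b, c: b or not c)  # P(A^c)P(B ∪ C^c)
print(lhs, rhs)  # 8/15 8/15
```

Any choice of marginals in (0, 1) would work here; the enumeration only relies on the product form of the atom probabilities.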

(c) Agree. If P(A|B) = 0, we have P(A ∩ B) = 0, so (assuming P(A) > 0)

P(B|A) = P(A ∩ B)/P(A) = 0.

(d) Agree. Suppose P(A|B) > P(A) but P(A|B^c) ≥ P(A). Then by the law of total probability

P(A) = P(A|B)P(B) + P(A|B^c)P(B^c) > P(A)P(B) + P(A)P(B^c) = P(A),

which is impossible. Hence P(A|B^c) < P(A).

5. (a) There are two ways the match can end in three straight sets: LLL and WWW, where W means Rafa wins a set and L means he loses one. By the law of total probability,

P(LLL) = P(LLL|stronger)P(stronger) + P(LLL|weaker)P(weaker) + P(LLL|equal)P(equal)
= (1/4)^3(1/3) + (3/4)^3(1/3) + (1/2)^3(1/3) = 3/16,

P(WWW) = P(WWW|stronger)P(stronger) + P(WWW|weaker)P(weaker) + P(WWW|equal)P(equal)
= (3/4)^3(1/3) + (1/4)^3(1/3) + (1/2)^3(1/3) = 3/16.

Thus, P(match ends in 3 sets) = P(LLL) + P(WWW) = 3/8.

(b) By symmetry, P(Rafa loses the match) = P(Rafa wins the match), so 2P(Rafa loses the match) = 1 and hence P(Rafa loses the match) = 1/2.

(c) There are three ways Rafa can lose by 1-3: WLLL, LWLL, and LLWL. Following a similar argument as in (a),

P(WLLL) = (3/4)(1/4)^3(1/3) + (1/4)(3/4)^3(1/3) + (1/2)^4(1/3) = 23/384.

Since P(WLLL) = P(LWLL) = P(LLWL), we get P(Rafa loses by 1-3) = 3 · 23/384 = 23/128.
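Parts (a) and (c) can be verified by exhaustively enumerating best-of-5 matches. This sketch assumes the per-set win probabilities 3/4, 1/2, 1/4 for the three equally likely opponent types, as in the solution above:

```python
from fractions import Fraction

# Rafa's per-set win probability under each equally likely opponent type.
p_win = {"stronger": Fraction(3, 4), "equal": Fraction(1, 2), "weaker": Fraction(1, 4)}

def outcome_probs(p):
    """Map each final set score (rafa_sets, opponent_sets) of a best-of-5
    match to its probability, given per-set win probability p."""
    out = {}
    def rec(w, l, prob):
        if w == 3 or l == 3:  # match over: first to 3 sets
            out[(w, l)] = out.get((w, l), Fraction(0)) + prob
            return
        rec(w + 1, l, prob * p)        # Rafa wins the next set
        rec(w, l + 1, prob * (1 - p))  # Rafa loses the next set
    rec(0, 0, Fraction(1))
    return out

third = Fraction(1, 3)  # each opponent type is equally likely
ends_in_3 = sum(third * pr for p in p_win.values()
                for score, pr in outcome_probs(p).items() if sum(score) == 3)
loses_1_3 = sum(third * outcome_probs(p).get((1, 3), 0) for p in p_win.values())
print(ends_in_3, loses_1_3)  # 3/8 23/128
```

The recursion enumerates only completed matches, so no sequence is double counted.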


(d) By Bayes' rule,

P(Rafa is stronger|Rafa loses by 1-3)
= P(Rafa loses by 1-3|Rafa is stronger) P(Rafa is stronger) / P(Rafa loses by 1-3)
= (9/256)(1/3) / (23/128) = 3/46.

6. (a) Let A_i denote the event that the ith drawn ball is white (i = 1, 2) and B the event that the coin lands heads. By the law of total probability,

P(A_1 ∩ A_2) = P(A_1 ∩ A_2|B)P(B) + P(A_1 ∩ A_2|B^c)P(B^c).

Given B (i.e., we sample from the first urn), the probability of drawing two white balls is (2/5)^2; given B^c (i.e., we sample from the second urn), it is (3/5)^2. Also, P(B) = P(B^c) = 1/2. Thus,

P(A_1 ∩ A_2) = (1/2)((2/5)^2 + (3/5)^2) = 13/50.

In a similar manner,

P(A_1) = P(A_1|B)P(B) + P(A_1|B^c)P(B^c) = (1/2)(2/5 + 3/5) = 1/2.

By the definition of conditional probability,

P(A_2|A_1) = P(A_1 ∩ A_2)/P(A_1) = (13/50)/(1/2) = 13/25.
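The arithmetic in (a) is easy to check in exact fractions. The variable names below are illustrative:

```python
from fractions import Fraction

half = Fraction(1, 2)
w1, w2 = Fraction(2, 5), Fraction(3, 5)  # white-ball fraction in each urn

p_a1a2 = half * (w1**2 + w2**2)  # both draws white (with replacement, from the same urn)
p_a1 = half * (w1 + w2)          # first draw white
print(p_a1a2 / p_a1)             # 13/25
```

Note that 13/25 > 1/2: the first white ball shifts belief toward the whiter urn, which is exactly the dependence discussed in part (c).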

(b) There is an equal chance of picking each urn and the balls are selected with replacement. Thus, P (2nd ball white) = (1/2) · (3/5) + (1/2) · (2/5) = 1/2. (c) The events in (a) and (b) are not independent since knowledge that the first ball was white implies that the second ball is more likely to have come from the urn with three white balls. To aid your intuition, you can consider an exaggerated scenario where one urn has 999 white balls and 1 black ball, and the other has 1 white ball and 999 black balls. In that case, if we pick an urn at random and draw a white ball from it, we intuitively would think it is quite unlikely we happened to pick from the urn with only 1 white ball. (d) (1/2)(3/5)n +(1/2)(2/5)n P (n − th ball white|first n − 1 balls white) = (1/2)(3/5)n−1 +(1/2)(2/5)n−1 . For large n, (2/5)n is much smaller than (3/5)n and hence the limit is 3/5. If the first n − 1 balls are white, it is likely that the next ball will come from the urn with three white balls (for which the probability of selecting a white ball is 3/5).
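The convergence to 3/5 claimed in (d) can be observed directly by evaluating the ratio for increasing n (a sketch; the function name is illustrative):

```python
from fractions import Fraction

def p_white_next(n):
    """P(nth ball white | first n-1 balls white), sampling with replacement
    from a uniformly chosen urn with white fractions 3/5 and 2/5."""
    num = Fraction(1, 2) * Fraction(3, 5)**n + Fraction(1, 2) * Fraction(2, 5)**n
    den = Fraction(1, 2) * Fraction(3, 5)**(n - 1) + Fraction(1, 2) * Fraction(2, 5)**(n - 1)
    return num / den

for n in (1, 5, 20, 60):
    print(n, float(p_white_next(n)))
# the values approach 3/5 = 0.6 as n grows
```

Note that n = 1 recovers the unconditional probability 1/2 from part (b).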
