Stat 135 (LEC 001): Concepts of Statistics, UC Berkeley, Fall 2020
Problem Set 1 Instructor: Prof. Yun S. Song
Due: 10pm PDT (UTC -7) September 4, 2020 (on Gradescope)
Show all your work to receive full credit.

1. Let X ∼ Poisson(λ) and Y ∼ Poisson(µ) be independent random variables.
   (a) Let Z = X + Y. What is the distribution of Z?
   (b) What is the conditional distribution of X given Z = m? What is the name of this distribution and what are its parameters?

2. Suppose X and Y have the following joint distribution:

              Y = 2   Y = 3   Y = 4
      X = 1    1/12    1/6      0
      X = 2    1/6      0      1/3
      X = 3    1/12    1/6      0

   (a) Show that X and Y are dependent.
   (b) Compute E[X], E[Y], Var(X), Var(Y) and the Pearson correlation ρ_{X,Y}.
   (c) Give the joint distribution of random variables U, V that have the same marginals as X, Y but are independent.

3. An urn initially contains b black balls and w white balls. In each trial, a ball is drawn at random from the urn and then returned to the urn together with d additional balls of the same color as the drawn ball. Let X_i denote the color of the ball drawn in the ith trial.
   (a) Let X_1, . . . , X_n be a particular sequence of draws with exactly j white balls and n − j black balls. What is the probability of this sequence of draws?
   (b) In 100 trials, what is the probability that the 25th draw is a white ball and the 75th draw is a black ball?
   (c) In n trials, what is the expected number of times that a white ball is drawn?

4. If X ∼ Gamma(α, λ), where α > 0 and λ > 0, then the probability density function of X is given by

      f_X(x) = (λ^α / Γ(α)) x^{α−1} e^{−λx}   for x ≥ 0,
      f_X(x) = 0                              for x < 0.

   Recall that the Gamma(n/2, 1/2) distribution is called the χ² distribution with n degrees of freedom, denoted χ²(n).
   (a) For α, β, λ > 0, suppose X ∼ Gamma(α, λ) and Y ∼ Gamma(β, λ) are independent random variables. Show that X + Y ∼ Gamma(α + β, λ).
   (b) Suppose X ∼ Normal(0, 1). Show that X² ∼ χ²(1).
   (c) Suppose X_1, . . . , X_n are independent and identically distributed Normal(0, 1). Show that Σ_{i=1}^n X_i² ∼ χ²(n).
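As an illustrative sanity check on parts (b) and (c) — not part of the assignment — a short Python simulation using only the standard library can compare the empirical mean and variance of Σ X_i² against the χ²(n) values n and 2n; the choice n = 5 and the trial count are arbitrary:

```python
import random

random.seed(0)

def sum_of_squared_normals(n, trials=20000):
    """Empirical mean and variance of sum_{i=1}^n X_i^2 for X_i ~ Normal(0, 1)."""
    samples = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))
               for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var

mean, var = sum_of_squared_normals(5)
# chi^2(5) has mean 5 and variance 2*5 = 10, so both estimates should land nearby
```

The estimates fluctuate with the seed and trial count, but they should sit close to 5 and 10, consistent with Σ X_i² ∼ χ²(n).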
5. Consider a random bit generator which in each trial produces a 1 with probability p or a 0 with probability (1 − p), independently of all other trials. Suppose you run this random bit generator n times, and let X_i denote the ith outcome. Define S_n = Σ_{i=1}^n X_i.
   (a) Use Chebyshev's inequality to put an upper bound on the probability P(|S_n/n − p| ≥ 2/√n).
   (b) Find P(X_1 = 0 | S_n = k), where 1 ≤ k ≤ n.
   (c) Find P(X_1 = 0, X_n = 1 | S_n = k), where 1 ≤ k ≤ n.
   (d) Find the conditional correlation of X_2 and X_n given that S_n = k, where 1 ≤ k ≤ n.

6. For 0 < p < 1 and k = 1, . . . , n, suppose W_k ∼ Geometric(p) are independent random variables (i.e., P(W_k = j) = (1 − p)^{j−1} p for j = 1, 2, . . .). Define S_n = W_1 + W_2 + · · · + W_n.
   (a) What is P(S_n = k), for k ≥ 0? Provide an exact formula.
   (b) What are E(S_n) and Var(S_n)?
   (c) Use Markov's inequality to put an upper bound on P(S_n ≥ 4n/p).
   (d) Use Chebyshev's inequality to put an upper bound on P(S_n ≥ 4n/p).
   (e) For n very large, approximate the probability P(S_n ≥ 4n/p). Provide your answer in terms of Φ(a) = (1/√(2π)) ∫_{−∞}^a e^{−x²/2} dx.
   (f) For every given ε > 0, what are the limits of P(S_n ≤ (1 − ε) 2n/p) and P(S_n ≤ (1 + ε) 2n/p) as n → ∞? Justify your answer.
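For intuition on parts (b)–(d) — again not part of the assignment — one can simulate S_n and compare an empirical tail probability with the Markov bound E(S_n)/(4n/p) = 1/4. A stdlib-only sketch, with illustrative values n = 20 and p = 0.5:

```python
import random

random.seed(1)

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

def tail_probability(n, p, trials=4000):
    """Monte Carlo estimate of P(S_n >= 4n/p), where S_n sums n geometrics."""
    threshold = 4 * n / p
    hits = sum(1 for _ in range(trials)
               if sum(geometric(p) for _ in range(n)) >= threshold)
    return hits / trials

n, p = 20, 0.5
markov_bound = (n / p) / (4 * n / p)   # E[S_n] / (4n/p) = 1/4, independent of n and p
estimate = tail_probability(n, p)      # the true tail is far smaller than the bound
```

The gap between the simulated tail and the 1/4 bound previews why the Chebyshev bound in (d) and the normal approximation in (e) are progressively sharper.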
7. A hierarchical model or Bayesian network is a model where random variables are functions of other random variables. These models come up widely in probabilistic modeling. Here we explore the Gamma–Poisson model:

      Λ ∼ Gamma(α, β),
      X | Λ ∼ Poisson(Λ),

   where the Gamma distribution is parametrized as described in Problem 4.
   (a) Find the marginal probability mass function (pmf) P[X = k] = p_X(k).
   (b) Find E[X] and Var(X). (Hint: Theorems A and B of Section 4.4 might be helpful.)
   (c) Let α = r for some integer r ≥ 1 and β = (1 − q)/q for 0 < q < 1. What is the marginal distribution of X in this case? (Hint: Γ(k) = (k − 1)! for any integer k ≥ 1.)
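A numerical sanity check for part (c) — not part of the assignment — is to sample Λ from the Gamma prior and then X from Poisson(Λ), and compare the empirical mean of X with rq/(1 − q), the mean the marginal should have if it is negative binomial. The values r = 3, q = 0.4 below are arbitrary; note that `random.gammavariate` takes a shape and a SCALE, while the document's Gamma(α, β) uses β as a rate, and that the standard library has no Poisson sampler, so a simple one is hand-rolled:

```python
import math
import random

random.seed(2)

def poisson(lam):
    """Sample Poisson(lam) via the product-of-uniforms method (fine for small lam)."""
    cutoff = math.exp(-lam)
    k, prod = 0, 1.0
    while prod > cutoff:
        k += 1
        prod *= random.random()
    return k - 1

r, q = 3, 0.4
beta = (1 - q) / q            # rate of the Gamma prior, as in part (c)
trials = 20000
# pass scale = 1/beta so the prior has rate beta (mean r/beta = r*q/(1-q))
xs = [poisson(random.gammavariate(r, 1 / beta)) for _ in range(trials)]
mean = sum(xs) / trials
# if the marginal is negative binomial, E[X] = r*q/(1-q) = 3*0.4/0.6 = 2.0
```

Matching the empirical mean (and, with more work, the full histogram) to the negative binomial prediction is a useful check on the algebra in (a) and (c).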