HW1 sol - Georgia Tech's Professor Tuo Zhao - ISYE 2028/ISYE 3030 Homework 1 Solutions

Course: Basic Statistical Methods
Institution: Georgia Institute of Technology
ISYE 2028/ISYE 3030 - Basic Statistical Methods, Fall 2019. Homework 1 - Basic Calculus and Probability, 100 points total.

This homework is due Tuesday Sep. 2, 2019 on Canvas / Sep. 3, 2019 in class.
• Please remember to staple if you turn in more than one page.
• Please make sure to SHOW ALL WORK in order to receive full credit.
• All logarithm functions stand for the NATURAL LOG, i.e. log(e) = 1.

1. X is a Poisson random variable with parameter λ. Then EX = ?
Hint: PMF: P(X = k) = λ^k e^{−λ} / k!, k = 0, 1, 2, ...
EX = λ
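The stated answer can be checked with a short derivation, sketched here using the power series for e^λ:

```latex
EX = \sum_{k=0}^{\infty} k \,\frac{\lambda^k e^{-\lambda}}{k!}
   = \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!}
   = \lambda e^{-\lambda} e^{\lambda} = \lambda
```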

2. X is a Poisson random variable with parameter λ. Then EX² = ?
EX² = λ² + λ
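One way to see this, sketched via the factorial moment E[X(X−1)]:

```latex
E[X(X-1)] = \sum_{k=2}^{\infty} k(k-1)\,\frac{\lambda^k e^{-\lambda}}{k!}
          = \lambda^2 e^{-\lambda} \sum_{k=2}^{\infty} \frac{\lambda^{k-2}}{(k-2)!}
          = \lambda^2,
\qquad
EX^2 = E[X(X-1)] + EX = \lambda^2 + \lambda
```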

3. X is a uniform random variable over [a, b]. Then EX = ?
Hint: PDF: f(x) = 1/(b−a) when x ∈ [a, b] and f(x) = 0 otherwise.
EX = (a + b)/2

4. X is a uniform random variable over [a, b]. Then EX² = ?
EX² = (a² + ab + b²)/3
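Both uniform moments follow from direct integration; a quick sketch:

```latex
EX = \int_a^b \frac{x}{b-a}\,dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2},
\qquad
EX^2 = \int_a^b \frac{x^2}{b-a}\,dx = \frac{b^3 - a^3}{3(b-a)} = \frac{a^2 + ab + b^2}{3}
```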

5. X is an exponential random variable with parameter λ. Then EX = ?
Hint: PDF: f(x) = λe^{−λx} for x > 0 and f(x) = 0 otherwise.
EX = 1/λ


6. X is an exponential random variable with parameter λ. Then EX² = ?
EX² = 2/λ²
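Both exponential moments can be sketched via integration by parts:

```latex
EX = \int_0^\infty x\,\lambda e^{-\lambda x}\,dx
   = \Big[-x e^{-\lambda x}\Big]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx
   = \frac{1}{\lambda},
\qquad
EX^2 = \int_0^\infty x^2\,\lambda e^{-\lambda x}\,dx
     = \frac{2}{\lambda}\int_0^\infty x\,\lambda e^{-\lambda x}\,dx
     = \frac{2}{\lambda^2}
```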

7. X is a Bernoulli random variable with parameter p. Then EX = ?
Hint: PMF: P(X = x) = p^x (1 − p)^{1−x} for x = 0, 1.
EX = p

8. X is a Bernoulli random variable with parameter p. Then EX² = ?
Since X only takes the values 0 and 1, X² = X, so EX² = EX = p.

9. X is a Binomial random variable with parameters n and p. Then EX² = ?
Hint: PMF: P(X = x) = (n choose x) p^x (1 − p)^{n−x} for x = 0, 1, ..., n.
We only show the calculation for this one; the rest of the questions are fairly straightforward.

EX² = Σ_{x=0}^n x² (n choose x) p^x (1−p)^{n−x}
    = Σ_{x=0}^n [x(x−1) + x] (n choose x) p^x (1−p)^{n−x}
    = Σ_{x=2}^n x(x−1) · n!/(x!(n−x)!) · p^x (1−p)^{n−x} + Σ_{x=1}^n x · n!/(x!(n−x)!) · p^x (1−p)^{n−x}
    = n(n−1) Σ_{x=2}^n (n−2)!/((x−2)!(n−x)!) · p^x (1−p)^{n−x} + n Σ_{x=1}^n (n−1)!/((x−1)!(n−x)!) · p^x (1−p)^{n−x}
    = n(n−1)p² Σ_{x=2}^n (n−2 choose x−2) p^{x−2} (1−p)^{(n−2)−(x−2)} + np Σ_{x=1}^n (n−1 choose x−1) p^{x−1} (1−p)^{(n−1)−(x−1)}
    = n(n−1)p² (p + (1−p))^{n−2} + np (p + (1−p))^{n−1}
    = n(n−1)p² + np
    = np(1 − p + np)


10. Assume that we have m coins. We toss each one of them n times. The probability of heads showing up for each coin is p. What's the probability of getting all n heads for at least one coin? Your answer should be in terms of m, n and p.

P[at least one coin has n heads] = 1 − P[no coin has n heads]
 = 1 − (P[for one coin and n tosses, not all heads])^m
 = 1 − (1 − P[for one coin and n tosses, all heads])^m
 = 1 − (1 − p^n)^m
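The closed form can be sanity-checked by brute-force enumeration over all outcomes for small m and n (a sketch; the function names are ours, not part of the original solution):

```python
from itertools import product

def prob_at_least_one_all_heads(m, n, p):
    """Closed form from problem 10: 1 - (1 - p^n)^m."""
    return 1 - (1 - p ** n) ** m

def brute_force(m, n, p):
    """Enumerate every outcome of m coins tossed n times and sum probabilities."""
    total = 0.0
    for outcome in product([0, 1], repeat=m * n):  # 1 = heads, 0 = tails
        prob = 1.0
        for toss in outcome:
            prob *= p if toss else (1 - p)
        coins = [outcome[i * n:(i + 1) * n] for i in range(m)]
        if any(all(c) for c in coins):  # at least one coin is all heads
            total += prob
    return total

print(prob_at_least_one_all_heads(2, 3, 0.6))  # closed form
print(brute_force(2, 3, 0.6))                  # should agree
```

For example, with m = 2, n = 2, p = 0.5 both approaches give 1 − (1 − 0.25)² = 0.4375.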

11. Let X₁, X₂, ..., Xₙ ∈ ℝ be n samples drawn independently and identically from a distribution with EXᵢ = µ and EXᵢ² = µ² + σ² < ∞. By the law of large numbers, what can we know about

lim_{n→∞} (1/n) Σ_{i=1}^n Xᵢ ?

This is a direct result of the law of large numbers, which claims that the sample mean X̄ₙ converges to µ: the weak law gives convergence in probability, and the strong law gives the stronger statement P[lim_{n→∞} X̄ₙ = µ] = 1.

12. Is the following statement TRUE or FALSE:

Cov(X, Y) = 0 ⇒ X and Y are independent,

where Cov(X, Y) = E(X − EX)(Y − EY) denotes the covariance between two random variables X and Y.

The statement is FALSE, and I will give a counterexample below. Suppose we have two random variables X and Y, where P[X = 1] = P[X = −1] = 0.5; further let Y = 0 when X = −1 and P[Y = 1] = P[Y = −1] = 0.5 when X = 1. Clearly the two random variables are dependent, since knowing the value of Y automatically gives us the value of X. Now we show that they have zero covariance. By construction E[X] = E[Y] = 0, and E[XY] = Σ_{(x,y)∈Ω} xy · P[X = x, Y = y] = 0 where Ω = {(−1, 0), (1, 1), (1, −1)}, so Cov[X, Y] = E[XY] − E[X]E[Y] = 0.
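The counterexample can be verified numerically by enumerating its joint distribution (a sketch; the variable names are ours):

```python
# Joint distribution of the counterexample: (x, y) -> probability
joint = {(-1, 0): 0.5, (1, 1): 0.25, (1, -1): 0.25}

ex = sum(x * p for (x, y), p in joint.items())        # E[X]
ey = sum(y * p for (x, y), p in joint.items())        # E[Y]
exy = sum(x * y * p for (x, y), p in joint.items())   # E[XY]
cov = exy - ex * ey
print(cov)  # 0.0 -> zero covariance

# Yet X and Y are dependent: P[X=1, Y=0] = 0 while P[X=1] * P[Y=0] = 0.25
p_x1 = sum(p for (x, y), p in joint.items() if x == 1)
p_y0 = sum(p for (x, y), p in joint.items() if y == 0)
print(joint.get((1, 0), 0.0), p_x1 * p_y0)  # 0.0 vs 0.25
```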

13. Given ℓ(µ, σ) = −n log(σ) − (n/2) log(2π) − (1/(2σ²)) Σ_{i=1}^n (xᵢ − µ)², solve the following problems:

(a) ∂ℓ(µ, σ)/∂µ = 0, solve for µ given σ fixed.
(b) ∂ℓ(µ, σ)/∂σ = 0, solve for σ given µ fixed.

(a)
∂ℓ(µ, σ)/∂µ = −(1/(2σ²)) Σ_{i=1}^n (−2)(xᵢ − µ) = (1/σ²) Σ_{i=1}^n (xᵢ − µ) = (Σ_{i=1}^n xᵢ − nµ)/σ²
By setting the partial derivative to 0, we get
µ = (1/n) Σ_{i=1}^n xᵢ

(b)
∂ℓ(µ, σ)/∂σ = −n/σ − (1/2)(−2)(1/σ³) Σ_{i=1}^n (xᵢ − µ)² = −n/σ + (Σ_{i=1}^n (xᵢ − µ)²)/σ³
By setting the partial derivative to 0, we get
σ² = (1/n) Σ_{i=1}^n (xᵢ − µ)²
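These closed forms can be spot-checked numerically: at (µ̂, σ̂) the log-likelihood should not be improved by small perturbations. A sketch on made-up data (all names and values are ours):

```python
import math

def loglik(xs, mu, sigma):
    """Normal log-likelihood l(mu, sigma) from problem 13."""
    n = len(xs)
    return (-n * math.log(sigma) - (n / 2) * math.log(2 * math.pi)
            - sum((x - mu) ** 2 for x in xs) / (2 * sigma ** 2))

xs = [1.2, 0.7, 2.3, 1.9, 0.4]  # arbitrary example data
mu_hat = sum(xs) / len(xs)                                           # from (a)
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in xs) / len(xs))  # from (b)

best = loglik(xs, mu_hat, sigma_hat)
for d in (-0.05, 0.05):  # nearby parameter values do worse
    assert best >= loglik(xs, mu_hat + d, sigma_hat)
    assert best >= loglik(xs, mu_hat, sigma_hat + d)
print(mu_hat, sigma_hat)
```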

14. Given ℓ(λ) = n log(λ) − λ Σ_{i=1}^n xᵢ, solve for λ given ∂ℓ(λ)/∂λ = 0.

Computing the partial derivative,
∂ℓ(λ)/∂λ = n/λ − Σ_{i=1}^n xᵢ
By setting the partial derivative to 0, we get
λ = n / Σ_{i=1}^n xᵢ

15. Given ℓ(λ) = −nλ + log(λ) Σ_{i=1}^n xᵢ − Σ_{i=1}^n log(xᵢ!), solve for λ given ∂ℓ(λ)/∂λ = 0. Here x! stands for the factorial of x.

Computing the partial derivative,
∂ℓ(λ)/∂λ = −n + (Σ_{i=1}^n xᵢ)/λ
By setting the partial derivative to 0, we get
λ = (1/n) Σ_{i=1}^n xᵢ


16. Given ℓ(p) = log(p) Σ_{i=1}^n xᵢ + log(1 − p) Σ_{i=1}^n (1 − xᵢ), solve for p given ∂ℓ(p)/∂p = 0.

Computing the partial derivative,
∂ℓ(p)/∂p = (Σ_{i=1}^n xᵢ)/p − (n − Σ_{i=1}^n xᵢ)/(1 − p) = ((1 − p) Σ_{i=1}^n xᵢ − p(n − Σ_{i=1}^n xᵢ)) / (p(1 − p)) = (Σ_{i=1}^n xᵢ − np) / (p(1 − p))
By setting the partial derivative to 0, we get
p = (1/n) Σ_{i=1}^n xᵢ
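The three maximum-likelihood answers (problems 14-16) can be checked the same way: each closed form should maximize its log-likelihood. A sketch on example data (all names and values are ours):

```python
import math

xs = [2.0, 0.5, 3.0, 1.0, 1.5]  # example positive data for problems 14-15
bern = [1, 0, 1, 1, 0]          # example 0/1 data for problem 16
n = len(xs)

lam14 = n / sum(xs)             # problem 14: exponential MLE
lam15 = sum(xs) / n             # problem 15: Poisson MLE
p16 = sum(bern) / len(bern)     # problem 16: Bernoulli MLE

def ell14(lam):
    return n * math.log(lam) - lam * sum(xs)

def ell15(lam):                 # dropping the constant term sum of log(x_i!)
    return -n * lam + math.log(lam) * sum(xs)

def ell16(p):
    return math.log(p) * sum(bern) + math.log(1 - p) * sum(1 - x for x in bern)

# each closed-form estimate beats small perturbations of itself
for ell, theta in [(ell14, lam14), (ell15, lam15), (ell16, p16)]:
    for d in (-0.01, 0.01):
        assert ell(theta) >= ell(theta + d)
print(lam14, lam15, p16)
```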

17. Find the PDF of a random variable given its CDF:

F(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x ≤ b; 1 for x > b.

Differentiating F wherever it is differentiable (f(x) = F′(x)), this is a uniform random variable, which has PDF

f(x) = (1/(b − a)) · 1{a ≤ x ≤ b}

18. Find the PDF of a random variable given its CDF:

F(x) = 1 − e^{−λx} for x ≥ 0; 0 for x < 0.

Differentiating again, this is an exponential random variable with parameter λ, which has PDF

f(x) = λe^{−λx} · 1{x ≥ 0}

