Title | Maths requirements for the course
---|---
Course | Microeconomics Analysis 1
Institution | University of Sydney
Pages | 9
ECON6001: Prerequisite Math 1
Derivatives
What is the derivative of the following functions?

1. y = x^5 ln(4x + 5)
2. y = 3e^x − 4x
3. p = q^(2/3) − ln q
4. u = (3y − 5)^3
5. u = (x^2 − 2)/(x + 3)^2
Optimization

1. Determine whether the following functions are concave or convex.
   (a) y = √x + 5 if x ≥ 0
   (b) u = 2x^2 − 5x
   (c) q = ln t − t^2 if t > 0
   (d) p = (10 − q)q
   (e) y = x^1.5 if x > 0

2. Sketch a graph of the following functions. You should be able to determine on which intervals the function is positive vs. negative, increasing vs. decreasing, concave (concave downwards) vs. convex (concave upwards).
   (a) y = x^4 − 4x^3
   (b) y = x^3 − 2x^2 + x − 2
   (c) y = −2x + 4
   (d) y = 5x/(10 + 3x)
   (e) y = x^2 − 1/x

3. Find the maximum and minimum of the following functions:
   (a) y = −x^2 + 10x − 6
   (b) u = ln t − t^2 if t > 0
   (c) y = −2w + 4 if w ∈ [−2, 6]
   (d) q = p^2 − 5p + 2
Systems of Equations
Solve the following systems of equations:

1. 2x − 25y = 17
   15y − x = −6

2. (y − 6)/(x − 1) = (y + 2)/(x + 15)
   (x − 3)/(y − 1) = (y − 4)/x

3. 10x + 3/(1 − y) = 4
   1/x + 2y = 2

4. x + 2y = 5
   y − 3z = 5
   3x − z = 4

5. x^2 + 4y^2 = 10
   x + 6y − 10 = 0
Partial Derivatives
What are the partial derivatives of the following functions?

1. f(x, y) = ln(x + y^2)
2. q(p_1, p_2) = (p_1 + 5)/(p_2 − 1)^5
3. u(x_1, x_2, x_3) = √x_1 (x_2^δ + 2x_3^δ)^(1/δ)
4. v(x, y, z) = x^(1/6) y^(−1/2) z^(1/4)
Constrained Optimization
Find maxima and minima of the following functions subject to the specified constraints.

1. f(x, y) = x^2 − y^2 subject to x^2 + y^2 = 1
2. f(x, y) = x + y subject to xy = 16
3. u(x, y) = √x √y subject to px + qy = 20, where p and q are parameters
4. u(x, y) = √x + √y subject to wz = 100 and x + y + w + z = 100
Basic Linear Algebra

1. What is the rank of the following matrices (rows separated by semicolons)?

   A = [2 3; 1 2], B = [2 4; 1 2], C = [2 3; 0 2], D = [1 1 2; 1 0 1; 0 2 0], E = [1 2 2; 3 4 6; 1 1 2]

2. Compute the following matrix products:
   (a) [2 3; 1 0] × [2; 1]
   (b) [x y] × [2 3; 1 0] × [x; y]
   (c) [3 2; 1 1] × [3 y; x 1] × [1; 2]
Basic Probability
Find the expected value of the following random variables:

1. X = {3, 9, 10} with the corresponding probabilities p = (0.5, 0.3, 0.2)
2. X = {−2, 1, 2, 5} with the corresponding probabilities p = (0.6, 0.1, 0.2, 0.1)
3. X is continuous on [−2, 2] with density f(x) = 0.25
4. X is continuous on [0, 5] with density f(x) = 2x/25
Answers and Resources

Derivatives

1. 5x^4 ln(4x + 5) + 4x^5/(4x + 5)
2. 3e^x − 4
3. (2/3)q^(−1/3) − 1/q
4. 9(3y − 5)^2
5. 2(3x + 2)/(x + 3)^3
Resources:
YouTube videos: Derivatives... How? (NancyPi); The Chain Rule... How? When? (NancyPi)
Reading: Math is fun: Derivative rules; Paul's Online Notes: Derivatives
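Each closed-form answer above is easy to sanity-check numerically. A minimal sketch in Python (helper names and sample points are my own, not part of the problem set) compares the answers for problems 1, 3, and 5 against a central finite difference:

```python
import math

def numderiv(f, x, h=1e-6):
    # central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# Problem 1: y = x^5 ln(4x + 5); answer 5x^4 ln(4x+5) + 4x^5/(4x+5)
f1 = lambda x: x**5 * math.log(4 * x + 5)
a1 = lambda x: 5 * x**4 * math.log(4 * x + 5) + 4 * x**5 / (4 * x + 5)

# Problem 3: p = q^(2/3) - ln q; answer (2/3)q^(-1/3) - 1/q
f3 = lambda q: q**(2 / 3) - math.log(q)
a3 = lambda q: (2 / 3) * q**(-1 / 3) - 1 / q

# Problem 5: u = (x^2 - 2)/(x + 3)^2; answer 2(3x + 2)/(x + 3)^3
f5 = lambda x: (x**2 - 2) / (x + 3) ** 2
a5 = lambda x: 2 * (3 * x + 2) / (x + 3) ** 3

for x in (0.5, 1.0, 2.0):
    assert abs(numderiv(f1, x) - a1(x)) < 1e-4
    assert abs(numderiv(f3, x) - a3(x)) < 1e-4
    assert abs(numderiv(f5, x) - a5(x)) < 1e-4
```

The same pattern works for any of the exercises: if the analytic derivative disagrees with the finite difference at a few sample points, the algebra has a mistake.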
Optimization

1. Concave vs. convex
   (a) y′ = 0.5x^(−0.5) and y′′ = −0.25x^(−1.5) ≤ 0 if x ≥ 0; hence it is concave.
   (b) u′ = 4x − 5 and u′′ = 4 > 0; hence it is convex.
   (c) q′ = 1/t − 2t and q′′ = −1/t^2 − 2 < 0; hence it is concave.
   (d) p′ = 10 − 2q and p′′ = −2 < 0; hence it is concave.
   (e) y′ = 1.5x^0.5 and y′′ = 0.75x^(−0.5) > 0 if x > 0; hence it is convex.

2. Graphs (a)–(e): [graph sketches omitted from this extract]

3. Maximum and minimum
   (a) max: x = 5, y = 19; no minimum (y → −∞ as x → ±∞)
   (b) max: t = 1/√2, u = −0.5(ln 2 + 1); no minimum (u → −∞ as t → +∞)
   (c) max: w = −2, y = 8; min: w = 6, y = −8
   (d) min: p = 2.5, q = −4.25; no maximum (q → +∞ as p → ±∞)

Resources:
YouTube videos: Concavity introduction; Analyzing concavity (algebraic); Inflection Points and Concavity Intuition; Graphing using derivatives; Analyzing a function with its derivative; Critical points introduction (all Khan Academy)
Reading: Sangaku Maths: Derivative applications
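The concavity verdicts in part 1 can be spot-checked with a numerical second derivative: a concave function has f′′ < 0, a convex one f′′ > 0. A minimal sketch (the helper name and sample points are my own):

```python
import math

def second_deriv(f, x, h=1e-4):
    # central-difference approximation of f''(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

# (a) y = sqrt(x) + 5 is concave on x > 0
assert second_deriv(lambda x: math.sqrt(x) + 5, 1.0) < 0
# (b) u = 2x^2 - 5x is convex
assert second_deriv(lambda x: 2 * x**2 - 5 * x, 1.0) > 0
# (c) q = ln t - t^2 is concave on t > 0
assert second_deriv(lambda t: math.log(t) - t**2, 1.0) < 0
# (d) p = (10 - q)q is concave
assert second_deriv(lambda q: (10 - q) * q, 1.0) < 0
# (e) y = x^1.5 is convex on x > 0
assert second_deriv(lambda x: x**1.5, 1.0) > 0

# 3(a): y = -x^2 + 10x - 6 peaks at x = 5 with y = 19
f = lambda x: -x**2 + 10 * x - 6
assert f(5) == 19 and f(5) >= max(f(4.9), f(5.1))
```

Checking the sign at a single interior point is enough here because each second derivative keeps one sign on the stated domain.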
Systems of Equations

1. x = 21, y = 1
2. x = 9, y = 10
3. x = 0.25, y = −1
4. x = 1, y = 2, z = −1
5. x = 1, y = 1.5

Resources:
YouTube videos: Solving Systems of Equations... Substitution Method (NancyPi); Solving Systems of Equations... Elimination Method (NancyPi); Discriminant of quadratic equations (Khan Academy); How to Solve Quadratic Equations by Factoring (NancyPi); Systems of nonlinear equations 3 (Khan Academy); How to Solve By Completing the Square (NancyPi)
Reading: Math is fun: Systems of Linear Equations; Math is fun: Quadratic Equations; Chili Math: Solving Systems of Nonlinear Equations
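Each reported solution can be verified by substituting it back into its system. A quick Python check, writing system 2 as (y − 6)/(x − 1) = (y + 2)/(x + 15) and (x − 3)/(y − 1) = (y − 4)/x, and system 3 as 10x + 3/(1 − y) = 4 and 1/x + 2y = 2 (my reading of the fractional equations):

```python
x, y = 21, 1                      # system 1
assert 2 * x - 25 * y == 17 and 15 * y - x == -6

x, y = 9, 10                      # system 2
assert (y - 6) / (x - 1) == (y + 2) / (x + 15)
assert (x - 3) / (y - 1) == (y - 4) / x

x, y = 0.25, -1                   # system 3
assert abs(10 * x + 3 / (1 - y) - 4) < 1e-12
assert abs(1 / x + 2 * y - 2) < 1e-12

x, y, z = 1, 2, -1                # system 4
assert x + 2 * y == 5 and y - 3 * z == 5 and 3 * x - z == 4

x, y = 1, 1.5                     # system 5
assert x**2 + 4 * y**2 == 10 and x + 6 * y - 10 == 0
```

Substitution only confirms that a candidate is a solution; for system 5 the quadratic in y collapses to (2y − 3)^2 = 0, so the solution is in fact unique.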
Partial Derivatives

1. f′_x(x, y) = 1/(x + y^2) and f′_y(x, y) = 2y/(x + y^2)
2. q′_1 = 1/(p_2 − 1)^5 and q′_2 = −5(p_1 + 5)/(p_2 − 1)^6
3. u′_1 = 0.5x_1^(−0.5)(x_2^δ + 2x_3^δ)^(1/δ); u′_2 = √x_1 (x_2^δ + 2x_3^δ)^(1/δ − 1) x_2^(δ−1); u′_3 = 2√x_1 (x_2^δ + 2x_3^δ)^(1/δ − 1) x_3^(δ−1)
4. v_x = (1/6)x^(−5/6) y^(−1/2) z^(1/4); v_y = −0.5x^(1/6) y^(−3/2) z^(1/4); v_z = (1/4)x^(1/6) y^(−1/2) z^(−3/4)

Resources:
YouTube videos: Partial derivatives - How to solve? (Krista King)
Reading: Paul's Online Notes: Partial Derivatives
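Partial derivatives can be checked the same way as ordinary ones, perturbing one argument at a time. A minimal sketch for answers 1 and 4 (helper name and evaluation points are my own):

```python
import math

def pd(f, args, i, h=1e-6):
    # central-difference partial derivative of f with respect to argument i
    hi, lo = list(args), list(args)
    hi[i] += h
    lo[i] -= h
    return (f(*hi) - f(*lo)) / (2 * h)

# Problem 1: f(x, y) = ln(x + y^2)
f = lambda x, y: math.log(x + y**2)
fx = lambda x, y: 1 / (x + y**2)
fy = lambda x, y: 2 * y / (x + y**2)
assert abs(pd(f, (1.0, 2.0), 0) - fx(1.0, 2.0)) < 1e-6
assert abs(pd(f, (1.0, 2.0), 1) - fy(1.0, 2.0)) < 1e-6

# Problem 4: v(x, y, z) = x^(1/6) y^(-1/2) z^(1/4)
v = lambda x, y, z: x**(1 / 6) * y**(-1 / 2) * z**(1 / 4)
vx = lambda x, y, z: (1 / 6) * x**(-5 / 6) * y**(-1 / 2) * z**(1 / 4)
vy = lambda x, y, z: -0.5 * x**(1 / 6) * y**(-3 / 2) * z**(1 / 4)
assert abs(pd(v, (2.0, 3.0, 4.0), 0) - vx(2.0, 3.0, 4.0)) < 1e-6
assert abs(pd(v, (2.0, 3.0, 4.0), 1) - vy(2.0, 3.0, 4.0)) < 1e-6
```

For answer 3 the same check works once a numeric value is fixed for the parameter δ.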
Constrained Optimization

1. min f = −1 at (x = 0, y = ±1); max f = 1 at (x = ±1, y = 0)
2. local min f = 8 at (x = 4, y = 4); local max f = −8 at (x = −4, y = −4); f is unbounded on the constraint set as a whole
3. min f = 0 at (x = 0, y = 20/q) and (x = 20/p, y = 0); max f = 10/√(pq) at (x = 10/p, y = 10/q)
4. f = 4√10 at (x = 40, y = 40, w = 10, z = 10), a local max on the w, z > 0 branch; global max f = 4√15 at (x = 60, y = 60, w = −10, z = −10)

Resources:
YouTube videos: Constrained optimization introduction; Lagrange multipliers, using tangency to solve constrained optimization; Finishing the intro Lagrange multiplier example; Lagrange multiplier example, part 1; Lagrange multiplier example, part 2; The Lagrangian (all Khan Academy)
Reading: Berkeley math: Lagrange multipliers and constrained optimization; math.vt: Constrained optimization: the method of Lagrange multipliers
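When the constraint can be parametrized or solved for one variable, a brute-force scan gives a cheap check on the Lagrange answers. A sketch for problems 1 and 2 (grid sizes are arbitrary choices of mine):

```python
import math

# Problem 1: f = x^2 - y^2 on the unit circle x^2 + y^2 = 1.
# Parametrize the constraint as (cos t, sin t) and scan a fine grid.
f = lambda x, y: x**2 - y**2
n = 100000
values = [f(math.cos(2 * math.pi * k / n), math.sin(2 * math.pi * k / n))
          for k in range(n)]
assert abs(max(values) - 1) < 1e-8   # attained at (x, y) = (+-1, 0)
assert abs(min(values) + 1) < 1e-8   # attained at (x, y) = (0, +-1)

# Problem 2: f = x + y on xy = 16 reduces to g(x) = x + 16/x on the
# x > 0 branch; the Lagrange critical point (4, 4) is a local MIN there.
g = lambda x: x + 16 / x
branch_min = min(g(1 + k * 0.001) for k in range(10000))  # x in [1, 11)
assert abs(branch_min - 8) < 1e-6
```

The scan also makes the classification in answer 2 visible: on the positive branch f never falls below 8, so (4, 4) cannot be a maximum.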
Basic Linear Algebra

1. rk(A) = 2, rk(B) = 1, rk(C) = 2, rk(D) = 3, rk(E) = 2
2. (a) [7; 2]
   (b) 2x^2 + 4xy
   (c) [2x + 6y + 13; x + 2y + 5]

Resources:
YouTube videos: Introduction to matrices; Matrix multiplication (part 1); Matrix multiplication (part 2); Dimension of the column space or rank (all Khan Academy); Linear Independence and Linear Dependence, Ex 1 (patrickJMT)
Reading: Basic Linear Algebra for Deep Learning; math.fsu: Rank of a Matrix. Linear Independence. Vector Space
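Rank and matrix products are both mechanical enough to verify in a few lines. A sketch using Gaussian elimination for the rank and a naive product, taking D = [1 1 2; 1 0 1; 0 2 0] and E = [1 2 2; 3 4 6; 1 1 2] (the function names and the sample values x = 2, y = 3 are my own):

```python
def rank(M, tol=1e-9):
    # row-reduce a copy of M and count the pivot rows
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            factor = M[i][c] / M[r][c]
            M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

D = [[1, 1, 2], [1, 0, 1], [0, 2, 0]]
E = [[1, 2, 2], [3, 4, 6], [1, 1, 2]]
assert rank(D) == 3
assert rank(E) == 2

def matmul(A, B):
    # naive matrix product of nested lists
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# product (a): [2 3; 1 0] x [2; 1] = [7; 2]
assert matmul([[2, 3], [1, 0]], [[2], [1]]) == [[7], [2]]

# product (c) at the sample point x = 2, y = 3 should match
# the symbolic answer [2x + 6y + 13; x + 2y + 5]
x0, y0 = 2, 3
res = matmul([[3, 2], [1, 1]], matmul([[3, y0], [x0, 1]], [[1], [2]]))
assert res == [[2 * x0 + 6 * y0 + 13], [x0 + 2 * y0 + 5]]
```

Evaluating a symbolic identity at a few numeric points is not a proof, but it catches sign and coefficient slips immediately.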
Basic Probability

1. 6.2
2. −0.2
3. 0
4. 10/3

Resources:
YouTube videos: Expected Value and Variance of Discrete Random Variables (jbstatistics); Deriving the Mean and Variance of a Continuous Probability Distribution (jbstatistics)
Reading: Math is fun: Random Variables: Mean, Variance and Standard Deviation; Probability course: Expectation of discrete random variable; Probability course: Expectation of continuous random variable
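The discrete answers are dot products of values and probabilities; the continuous ones are integrals of x f(x), which a midpoint sum approximates well enough to check. A minimal sketch (the helper name and grid size are my own):

```python
# Discrete: E[X] = sum of x * p(x)
X1, p1 = [3, 9, 10], [0.5, 0.3, 0.2]
ev1 = sum(x * p for x, p in zip(X1, p1))
assert abs(ev1 - 6.2) < 1e-12

X2, p2 = [-2, 1, 2, 5], [0.6, 0.1, 0.2, 0.1]
ev2 = sum(x * p for x, p in zip(X2, p2))
assert abs(ev2 - (-0.2)) < 1e-12

# Continuous: E[X] = integral of x f(x) dx, via a midpoint sum
def expectation(density, a, b, n=100_000):
    h = (b - a) / n
    return sum((a + (k + 0.5) * h) * density(a + (k + 0.5) * h) * h
               for k in range(n))

# 3: uniform density 0.25 on [-2, 2] has mean 0
assert abs(expectation(lambda x: 0.25, -2, 2) - 0) < 1e-6
# 4: density 2x/25 on [0, 5] has mean 10/3
assert abs(expectation(lambda x: 2 * x / 25, 0, 5) - 10 / 3) < 1e-6
```

Before trusting an expectation, it is also worth checking that the density integrates to 1; the same midpoint helper can do that with `density` alone in the summand.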