Syllabus

Course: Econometrics
Institution: The University of Texas at Dallas


ECON 4355 Econometrics, Section 001, Spring 2016

Instructor: Dr. Dong Li
Office: GR 2.810
Email: [email protected]
Tel: (972) 883-3517
Office Hours: TuTh 11:15-12:00 or by appointment
Course Hours: TuTh 1:00-2:15
Course Web: http://www.utdallas.edu/elearning/
ECONLab: M-Th 3-7pm (GR 3.416)

Prerequisites

Students must have taken (1) STAT 1342, EPPS 2302, EPPS 2303, or OPRE 3360; and (2) MATH 1326, MATH 2414, or MATH 2419.

Course Description

This course is about the application of statistical methods to economic analysis, with particular attention to regression analysis and hypothesis testing. It is an introductory econometrics course: you will learn how to use statistical methods to analyze economic data, including empirically testing economic theories and making predictions. After a brief review of probability and statistics, we will cover the simple linear regression model and hypothesis testing, then general linear regression models. You will also learn how to use dummy variables and how to test and correct for heteroskedasticity and serial correlation. We will use the software Stata for the data analysis; you are required to have access to Stata to finish your homework assignments and the course project. Emphasis is placed on empirical applications with a touch of econometric theory. I expect enthusiasm, curiosity, and eagerness from you.

Student Learning Objectives/Outcomes

After taking the course, students are expected to: (1) understand the necessary econometric software/tools to analyze economic data/models; (2) implement various econometric analyses and interpret the results; (3) communicate the results to peers and professionals.

Required Textbook

Principles of Econometrics (4th ed.), Hill et al., Wiley, 2011. Class notes will be posted on eLearning; please check eLearning often for new notes, Stata handouts, homework assignments, etc.

Recommended Textbooks

Introduction to Econometrics (4th ed.), Maddala & Lahiri, Wiley, 2009.
Introductory Econometrics (4th ed.), Wooldridge, South-Western, 2011.
Using Stata for Principles of Econometrics (5th ed.), Adkins et al., Wiley, 2015.

Topics and Academic Calendar

Week 1: Intro, calculus review, summation review
Week 2: Chapter 2
Week 3: Chapter 2
Week 4: Chapter 3
Week 5: Chapter 4
Week 6: Exam 1
Week 7: Chapter 5
Week 8: Chapter 5
Week 9: Chapter 6
Week 10: No class (UTD Spring Break)
Week 11: Chapter 7
Week 12: Exam 2
Week 13: Chapter 8
Week 14: Chapter 9
Week 15: Review
Week 16: Exam 3

Depending on the progress in the classroom, the instructor may adjust the above schedule.

Exams

There will be three exams, all comprehensive. Exams will consist of multiple choice, true/false, and short essay questions, covering material from the textbook, the instructor's lectures, handouts, and homework. The dates of the exams will be announced when they are available. Generally there is no make-up exam. If you miss exam 1 or exam 2 with a properly documented excuse conforming to university policy, your exam 3 grade will be applied to the missed exam(s). If you miss exam 3 with a properly documented excuse consistent with university policy, you will be given a make-up exam at the discretion of the instructor. Always bring a calculator to the exams. A formula sheet identical to the one attached to this syllabus will be provided for all exams (it was copied from the inside covers of your textbook).

Homework

There will be homework assignments, including both theoretical and empirical questions, throughout the semester. The due date of each assignment will be announced in class. No late assignment will be accepted. Assignments must be well written (or typed) and complete; any assignment that is illegible (as determined by the instructor) or incomplete will receive partial or no credit. You MUST turn in a hard copy of your homework by the due date to receive credit. Emailed assignments will not be graded unless prior arrangement has been made with the instructor. You may discuss the homework assignments with other students, but you must write your own answers.


Project

Every student will replicate the results in a published paper. More details about the project will be announced in class.

Grading

Grades will be based on the homework assignments (15%), the project report (5%), and the three exams (25%, 25%, and 30%, respectively). Grading scale: 90-100 = A, 80-89.9 = B, 70-79.9 = C, 60-69.9 = D, below 60 = F. I may adjust this scale when appropriate.

How You Should Study for This Course

While handouts are provided through eLearning, it is imperative to take notes during class to understand the topics. While I do not check attendance, the opportunity cost of missing class will be substantial. This is a 3-credit-hour course: each week we will spend 2.5 hours in class, and you are expected to spend at least twice as much time, i.e., at least 5 hours per week, on this course outside the classroom, including reading the textbook/handouts, doing homework, etc.

How to Order Stata for Home Use

To order Stata, go to http://www.stata.com/order/gradplan-sites/, click on "University of Texas, Dallas," and then click on "View our student purchasing options." (This is very important; otherwise you will pay a lot more as a faculty member.) I recommend at least Stata/IC; you can buy the six-month, twelve-month, or perpetual license. Stata/Small, while cheaper, may not be able to handle all the observations in the homework/project. Older versions of Stata should be fine. You will use Stata a few times in the classroom throughout the semester, so make sure Stata is functioning on your laptop by the end of the first week. Most of the computers in the EPPS computer labs have Stata on them (but the labs are often used for teaching during the day).

UT Dallas Syllabus Policies and Procedures

The information contained in the following link constitutes the University's policies and procedures segment of the course syllabus. Please go to http://go.utdallas.edu/syllabus-policies for these policies.
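The grading scheme above is a simple weighted average. A minimal Python sketch (the course itself uses Stata; the function names here are mine, not part of the syllabus), treating each component as a score on a 0-100 scale:

```python
# Hypothetical helper illustrating the syllabus grading scheme:
# homework 15%, project 5%, exams 25%, 25%, 30%.

def course_average(homework, project, exam1, exam2, exam3):
    """Weighted course average on a 0-100 scale."""
    return (0.15 * homework + 0.05 * project
            + 0.25 * exam1 + 0.25 * exam2 + 0.30 * exam3)

def letter_grade(avg):
    """Map a numeric average to the syllabus letter scale."""
    if avg >= 90:
        return "A"
    if avg >= 80:
        return "B"
    if avg >= 70:
        return "C"
    if avg >= 60:
        return "D"
    return "F"
```

For example, scores of 80 (homework), 90 (project), and 70, 85, 75 on the three exams average to 77.75, a C under the scale above.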

The descriptions and timeline contained in this syllabus are subject to change at the discretion of the instructor.


The Rules of Summation

$\sum_{i=1}^{n} x_i = x_1 + x_2 + \cdots + x_n$

$\sum_{i=1}^{n} a = na$

$\sum_{i=1}^{n} a x_i = a \sum_{i=1}^{n} x_i$

$\sum_{i=1}^{n} (x_i + y_i) = \sum_{i=1}^{n} x_i + \sum_{i=1}^{n} y_i$

$\sum_{i=1}^{n} (a x_i + b y_i) = a \sum_{i=1}^{n} x_i + b \sum_{i=1}^{n} y_i$

$\sum_{i=1}^{n} (a + b x_i) = na + b \sum_{i=1}^{n} x_i$

$\bar{x} = \frac{\sum_{i=1}^{n} x_i}{n} = \frac{x_1 + x_2 + \cdots + x_n}{n}$

$\sum_{i=1}^{n} (x_i - \bar{x}) = 0$

$\sum_{i=1}^{2} \sum_{j=1}^{3} f(x_i, y_j) = \sum_{i=1}^{2} [f(x_i, y_1) + f(x_i, y_2) + f(x_i, y_3)] = f(x_1, y_1) + f(x_1, y_2) + f(x_1, y_3) + f(x_2, y_1) + f(x_2, y_2) + f(x_2, y_3)$

Expected Values & Variances

$E(X) = x_1 f(x_1) + x_2 f(x_2) + \cdots + x_n f(x_n) = \sum_{i=1}^{n} x_i f(x_i) = \sum_{x} x f(x)$

$E[g(X)] = \sum_{x} g(x) f(x)$

$E[g_1(X) + g_2(X)] = \sum_{x} [g_1(x) + g_2(x)] f(x) = \sum_{x} g_1(x) f(x) + \sum_{x} g_2(x) f(x) = E[g_1(X)] + E[g_2(X)]$

$E(c) = c$, $E(cX) = cE(X)$, $E(a + cX) = a + cE(X)$

$\mathrm{var}(X) = \sigma^2 = E[X - E(X)]^2 = E(X^2) - [E(X)]^2$

$\mathrm{var}(a + cX) = E[(a + cX) - E(a + cX)]^2 = c^2\,\mathrm{var}(X)$

Marginal and Conditional Distributions

$f(x) = \sum_{y} f(x, y)$ for each value $X$ can take

$f(y) = \sum_{x} f(x, y)$ for each value $Y$ can take

$f(x|y) = P[X = x \mid Y = y] = \frac{f(x, y)}{f(y)}$

If $X$ and $Y$ are independent random variables, then $f(x, y) = f(x) f(y)$ for each and every pair of values $x$ and $y$. The converse is also true. In that case the conditional probability density function of $X$ given $Y = y$ is $f(x|y) = \frac{f(x, y)}{f(y)} = \frac{f(x) f(y)}{f(y)} = f(x)$ for each and every pair of values $x$ and $y$; the converse is also true.

Expectations, Variances & Covariances

$\mathrm{cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = \sum_{x} \sum_{y} [x - E(X)][y - E(Y)] f(x, y)$

$\rho = \frac{\mathrm{cov}(X, Y)}{\sqrt{\mathrm{var}(X)\,\mathrm{var}(Y)}}$

$E(c_1 X + c_2 Y) = c_1 E(X) + c_2 E(Y)$, $E(X + Y) = E(X) + E(Y)$

$\mathrm{var}(aX + bY + cZ) = a^2\,\mathrm{var}(X) + b^2\,\mathrm{var}(Y) + c^2\,\mathrm{var}(Z) + 2ab\,\mathrm{cov}(X, Y) + 2ac\,\mathrm{cov}(X, Z) + 2bc\,\mathrm{cov}(Y, Z)$

If $X$, $Y$, and $Z$ are independent, or uncorrelated, random variables, then the covariance terms are zero and $\mathrm{var}(aX + bY + cZ) = a^2\,\mathrm{var}(X) + b^2\,\mathrm{var}(Y) + c^2\,\mathrm{var}(Z)$.

Normal Probabilities

If $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X - \mu}{\sigma} \sim N(0, 1)$.

If $X \sim N(\mu, \sigma^2)$ and $a$ is a constant, then $P(X \le a) = P\left(Z \le \frac{a - \mu}{\sigma}\right)$.

If $X \sim N(\mu, \sigma^2)$ and $a$ and $b$ are constants, then $P(a \le X \le b) = P\left(\frac{a - \mu}{\sigma} \le Z \le \frac{b - \mu}{\sigma}\right)$.
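The expectation, variance, and covariance rules can be checked numerically on a small discrete joint distribution. A pure-Python sketch (the pmf and names are made up for illustration):

```python
# Check E, var, cov, rho, and the rule
#   var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y)
# on a small made-up joint pmf f(x, y).
import math

pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def E(g):
    """E[g(X, Y)] = sum over (x, y) of g(x, y) f(x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
varX = E(lambda x, y: (x - EX) ** 2)
varY = E(lambda x, y: (y - EY) ** 2)
covXY = E(lambda x, y: (x - EX) * (y - EY))
rho = covXY / math.sqrt(varX * varY)

a, b = 2.0, 3.0
lhs = E(lambda x, y: (a * x + b * y - (a * EX + b * EY)) ** 2)
rhs = a * a * varX + b * b * varY + 2 * a * b * covXY
assert math.isclose(lhs, rhs)
```

Here $E(X) = 0.7$, $E(Y) = 0.6$, and $\mathrm{cov}(X, Y) = 0.4 - 0.7 \times 0.6 = -0.02$, so the two variables are slightly negatively correlated.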

Assumptions of the Simple Linear Regression Model

SR1. The value of $y$, for each value of $x$, is $y = \beta_1 + \beta_2 x + e$.
SR2. The average value of the random error $e$ is $E(e) = 0$, since we assume that $E(y) = \beta_1 + \beta_2 x$.
SR3. The variance of the random error $e$ is $\mathrm{var}(e) = \sigma^2 = \mathrm{var}(y)$.
SR4. The covariance between any pair of random errors $e_i$ and $e_j$ is $\mathrm{cov}(e_i, e_j) = \mathrm{cov}(y_i, y_j) = 0$.
SR5. The variable $x$ is not random and must take at least two different values.
SR6. (optional) The values of $e$ are normally distributed about their mean: $e \sim N(0, \sigma^2)$.

Least Squares Estimation

If $b_1$ and $b_2$ are the least squares estimates, then
$\hat{y}_i = b_1 + b_2 x_i$, $\hat{e}_i = y_i - \hat{y}_i = y_i - b_1 - b_2 x_i$.

The Normal Equations
$N b_1 + \left(\sum x_i\right) b_2 = \sum y_i$
$\left(\sum x_i\right) b_1 + \left(\sum x_i^2\right) b_2 = \sum x_i y_i$

Least Squares Estimators
$b_2 = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sum (x_i - \bar{x})^2}$, $b_1 = \bar{y} - b_2 \bar{x}$

Elasticity
$\eta = \frac{\text{percentage change in } y}{\text{percentage change in } x} = \frac{\Delta y / y}{\Delta x / x} = \frac{\Delta y}{\Delta x} \cdot \frac{x}{y}$
$\eta = \frac{\Delta E(y) / E(y)}{\Delta x / x} = \frac{\Delta E(y)}{\Delta x} \cdot \frac{x}{E(y)} = \beta_2 \cdot \frac{x}{E(y)}$
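The least squares formulas above can be computed directly, and the normal equations then hold exactly at the solution. A minimal Python sketch with made-up data (variable names are mine, not the textbook's):

```python
# Least squares estimates via
#   b2 = sum (xi - xbar)(yi - ybar) / sum (xi - xbar)^2,  b1 = ybar - b2 xbar
def ols(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b2 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
          / sum((xi - xbar) ** 2 for xi in x))
    b1 = ybar - b2 * xbar
    return b1, b2

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 2.0, 4.0, 4.0, 6.0]
b1, b2 = ols(x, y)

# The normal equations hold at the solution:
#   N b1 + (sum xi) b2 = sum yi
#   (sum xi) b1 + (sum xi^2) b2 = sum xi yi
assert abs(len(x) * b1 + sum(x) * b2 - sum(y)) < 1e-9
assert abs(sum(x) * b1 + sum(xi * xi for xi in x) * b2
           - sum(xi * yi for xi, yi in zip(x, y))) < 1e-9
```

For this data the fitted line is $\hat{y} = 0.6 + 1.0\,x$.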

Least Squares Expressions Useful for Theory
$b_2 = \beta_2 + \sum w_i e_i$, where $w_i = \frac{x_i - \bar{x}}{\sum (x_i - \bar{x})^2}$
$\sum w_i = 0$, $\sum w_i x_i = 1$, $\sum w_i^2 = 1 / \sum (x_i - \bar{x})^2$

Properties of the Least Squares Estimators
$\mathrm{var}(b_1) = \sigma^2 \left[\frac{\sum x_i^2}{N \sum (x_i - \bar{x})^2}\right]$, $\mathrm{var}(b_2) = \frac{\sigma^2}{\sum (x_i - \bar{x})^2}$, $\mathrm{cov}(b_1, b_2) = \sigma^2 \left[\frac{-\bar{x}}{\sum (x_i - \bar{x})^2}\right]$

Gauss-Markov Theorem: Under assumptions SR1-SR5 of the linear regression model, the estimators $b_1$ and $b_2$ have the smallest variance of all linear and unbiased estimators of $\beta_1$ and $\beta_2$. They are the Best Linear Unbiased Estimators (BLUE) of $\beta_1$ and $\beta_2$.

If we make the normality assumption, assumption SR6, about the error term, then the least squares estimators are normally distributed:
$b_1 \sim N\left(\beta_1, \frac{\sigma^2 \sum x_i^2}{N \sum (x_i - \bar{x})^2}\right)$, $b_2 \sim N\left(\beta_2, \frac{\sigma^2}{\sum (x_i - \bar{x})^2}\right)$

Estimated Error Variance
$\hat{\sigma}^2 = \frac{\sum \hat{e}_i^2}{N - 2}$

Estimator Standard Errors
$\mathrm{se}(b_1) = \sqrt{\widehat{\mathrm{var}}(b_1)}$, $\mathrm{se}(b_2) = \sqrt{\widehat{\mathrm{var}}(b_2)}$

t-distribution
If assumptions SR1-SR6 of the simple linear regression model hold, then
$t = \frac{b_k - \beta_k}{\mathrm{se}(b_k)} \sim t_{(N-2)}$, $k = 1, 2$.

Interval Estimates
$P[b_2 - t_c\,\mathrm{se}(b_2) \le \beta_2 \le b_2 + t_c\,\mathrm{se}(b_2)] = 1 - \alpha$

Hypothesis Testing
Components of hypothesis tests: 1. a null hypothesis $H_0$; 2. an alternative hypothesis $H_1$; 3. a test statistic; 4. a rejection region; 5. a conclusion.
If the null hypothesis $H_0: \beta_2 = c$ is true, then $t = \frac{b_2 - c}{\mathrm{se}(b_2)} \sim t_{(N-2)}$.
Rejection rule for a two-tail test: if the value of the test statistic falls in the rejection region, either tail of the t-distribution, then we reject the null hypothesis and accept the alternative.
Type I error: the null hypothesis is true and we decide to reject it. Type II error: the null hypothesis is false and we decide not to reject it.
p-value rejection rule: when the p-value of a hypothesis test is smaller than the chosen value of $\alpha$, the test procedure leads to rejection of the null hypothesis.

Prediction
$y_0 = \beta_1 + \beta_2 x_0 + e_0$, $\hat{y}_0 = b_1 + b_2 x_0$, $f = \hat{y}_0 - y_0$
$\widehat{\mathrm{var}}(f) = \hat{\sigma}^2 \left[1 + \frac{1}{N} + \frac{(x_0 - \bar{x})^2}{\sum (x_i - \bar{x})^2}\right]$, $\mathrm{se}(f) = \sqrt{\widehat{\mathrm{var}}(f)}$
A $(1 - \alpha) \times 100\%$ confidence interval, or prediction interval, for $y_0$: $\hat{y}_0 \pm t_c\,\mathrm{se}(f)$

Goodness of Fit
$\sum (y_i - \bar{y})^2 = \sum (\hat{y}_i - \bar{y})^2 + \sum \hat{e}_i^2$, i.e. $SST = SSR + SSE$
$R^2 = \frac{SSR}{SST} = 1 - \frac{SSE}{SST} = (\mathrm{corr}(y, \hat{y}))^2$

Log-Linear Model
$\ln(y) = \beta_1 + \beta_2 x + e$, $\widehat{\ln(y)} = b_1 + b_2 x$
$100 \times b_2 \approx$ percentage change in $y$ given a one-unit change in $x$.
Natural predictor: $\hat{y}_n = \exp(b_1 + b_2 x)$; corrected predictor: $\hat{y}_c = \exp(b_1 + b_2 x)\exp(\hat{\sigma}^2 / 2)$
Prediction interval: $\left[\exp\left(\widehat{\ln(y)} - t_c\,\mathrm{se}(f)\right),\ \exp\left(\widehat{\ln(y)} + t_c\,\mathrm{se}(f)\right)\right]$
Generalized goodness-of-fit measure: $R_g^2 = (\mathrm{corr}(y, \hat{y}_n))^2$

Assumptions of the Multiple Regression Model
MR1. $y_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i$
MR2. $E(y_i) = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK}$, $E(e_i) = 0$
MR3. $\mathrm{var}(y_i) = \mathrm{var}(e_i) = \sigma^2$
MR4. $\mathrm{cov}(y_i, y_j) = \mathrm{cov}(e_i, e_j) = 0$
MR5. The values of $x_{ik}$ are not random and are not exact linear functions of the other explanatory variables.
MR6. $y_i \sim N[(\beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK}),\ \sigma^2]$, equivalently $e_i \sim N(0, \sigma^2)$

Least Squares Estimates in the MR Model
The least squares estimates $b_1, b_2, \ldots, b_K$ minimize $S(b_1, b_2, \ldots, b_K) = \sum (y_i - b_1 - b_2 x_{i2} - \cdots - b_K x_{iK})^2$

Estimated Error Variance and Estimator Standard Errors
$\hat{\sigma}^2 = \frac{\sum \hat{e}_i^2}{N - K}$, $\mathrm{se}(b_k) = \sqrt{\widehat{\mathrm{var}}(b_k)}$
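The decomposition $SST = SSR + SSE$ and the identity $R^2 = (\mathrm{corr}(y, \hat{y}))^2$ can be verified numerically. A Python sketch on a small made-up data set:

```python
# Verify SST = SSR + SSE and R^2 = corr(y, yhat)^2 for a fitted line.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 2.0, 4.0, 4.0, 6.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b2 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
      / sum((xi - xbar) ** 2 for xi in x))
b1 = ybar - b2 * xbar
yhat = [b1 + b2 * xi for xi in x]
yhm = sum(yhat) / n            # equals ybar: fitted values average to ybar

sst = sum((yi - ybar) ** 2 for yi in y)
ssr = sum((yh - yhm) ** 2 for yh in yhat)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
r2 = ssr / sst

cov_y_yhat = sum((yi - ybar) * (yh - yhm) for yi, yh in zip(y, yhat)) / n
corr = cov_y_yhat / math.sqrt((sst / n) * (ssr / n))
assert math.isclose(sst, ssr + sse)
assert math.isclose(r2, corr ** 2)
```

Here $SST = 11.2$, $SSR = 10$, $SSE = 1.2$, so $R^2 = 10 / 11.2 \approx 0.893$.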

Hypothesis Tests and Interval Estimates for Single Parameters
Use the t-distribution: $t = \frac{b_k - \beta_k}{\mathrm{se}(b_k)} \sim t_{(N-K)}$

t-test for More than One Parameter
$H_0: \beta_2 + c\beta_3 = a$. When $H_0$ is true,
$t = \frac{b_2 + c b_3 - a}{\mathrm{se}(b_2 + c b_3)} \sim t_{(N-K)}$, where $\mathrm{se}(b_2 + c b_3) = \sqrt{\widehat{\mathrm{var}}(b_2) + c^2\,\widehat{\mathrm{var}}(b_3) + 2c\,\widehat{\mathrm{cov}}(b_2, b_3)}$

Joint F-tests
To test $J$ joint hypotheses: $F = \frac{(SSE_R - SSE_U)/J}{SSE_U/(N - K)}$
To test the overall significance of the model, the null and alternative hypotheses and F statistic are
$H_0: \beta_2 = 0, \beta_3 = 0, \ldots, \beta_K = 0$; $H_1$: at least one of the $\beta_k$ is nonzero;
$F = \frac{(SST - SSE)/(K - 1)}{SSE/(N - K)}$

RESET: A Specification Test
$y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + e_i$, $\hat{y}_i = b_1 + b_2 x_{i2} + b_3 x_{i3}$
$y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + \gamma_1 \hat{y}_i^2 + e_i$, with $H_0: \gamma_1 = 0$
$y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + \gamma_1 \hat{y}_i^2 + \gamma_2 \hat{y}_i^3 + e_i$, with $H_0: \gamma_1 = \gamma_2 = 0$

Model Selection
$AIC = \ln(SSE/N) + 2K/N$
$SC = \ln(SSE/N) + K\ln(N)/N$

Collinearity and Omitted Variables
$y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + e_i$
$\mathrm{var}(b_2) = \frac{\sigma^2}{(1 - r_{23}^2) \sum (x_{i2} - \bar{x}_2)^2}$
When $x_3$ is omitted, $\mathrm{bias}(b_2^*) = E(b_2^*) - \beta_2 = \beta_3 \frac{\widehat{\mathrm{cov}}(x_2, x_3)}{\widehat{\mathrm{var}}(x_2)}$

Heteroskedasticity
$\mathrm{var}(y_i) = \mathrm{var}(e_i) = \sigma_i^2$
General variance function: $\sigma_i^2 = \exp(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS})$
Breusch-Pagan and White tests for $H_0: \alpha_2 = \alpha_3 = \cdots = \alpha_S = 0$: when $H_0$ is true, $\chi^2 = N \times R^2 \sim \chi^2_{(S-1)}$
Goldfeld-Quandt test for $H_0: \sigma_M^2 = \sigma_R^2$ versus $H_1: \sigma_M^2 \ne \sigma_R^2$: when $H_0$ is true, $F = \hat{\sigma}_M^2 / \hat{\sigma}_R^2 \sim F_{(N_M - K_M,\ N_R - K_R)}$
Transformed model for $\mathrm{var}(e_i) = \sigma_i^2 = \sigma^2 x_i$: $y_i/\sqrt{x_i} = \beta_1(1/\sqrt{x_i}) + \beta_2(x_i/\sqrt{x_i}) + e_i/\sqrt{x_i}$
Estimating the variance function: $\ln(\hat{e}_i^2) = \ln(\sigma_i^2) + v_i = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS} + v_i$
Grouped data: $\mathrm{var}(e_i) = \sigma_i^2 = \sigma_M^2$ for $i = 1, 2, \ldots, N_M$ and $\sigma_i^2 = \sigma_R^2$ for $i = 1, 2, \ldots, N_R$
Transformed model for feasible generalized least squares: $y_i/\hat{\sigma}_i = \beta_1(1/\hat{\sigma}_i) + \beta_2(x_i/\hat{\sigma}_i) + e_i/\hat{\sigma}_i$

Regression with Stationary Time Series Variables
Finite distributed lag model: $y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_q x_{t-q} + v_t$
Correlogram: $r_k = \sum (y_t - \bar{y})(y_{t-k} - \bar{y}) / \sum (y_t - \bar{y})^2$; for $H_0: \rho_k = 0$, $z = \sqrt{T}\, r_k \sim N(0, 1)$
LM test: $y_t = \beta_1 + \beta_2 x_t + \rho \hat{e}_{t-1} + \hat{v}_t$, test $H_0: \rho = 0$ with a t-test; or $\hat{e}_t = \gamma_1 + \gamma_2 x_t + \rho \hat{e}_{t-1} + \hat{v}_t$, test using $LM = T \times R^2$
AR(1) error: $y_t = \beta_1 + \beta_2 x_t + e_t$, $e_t = \rho e_{t-1} + v_t$
Nonlinear least squares estimation: $y_t = \beta_1(1 - \rho) + \beta_2 x_t + \rho y_{t-1} - \beta_2 \rho x_{t-1} + v_t$
ARDL($p$, $q$) model: $y_t = \delta + \delta_0 x_t + \delta_1 x_{t-1} + \cdots + \delta_q x_{t-q} + \theta_1 y_{t-1} + \cdots + \theta_p y_{t-p} + v_t$
AR($p$) forecasting model: $y_t = \delta + \theta_1 y_{t-1} + \theta_2 y_{t-2} + \cdots + \theta_p y_{t-p} + v_t$
Exponential smoothing: $\hat{y}_t = \alpha y_{t-1} + (1 - \alpha)\hat{y}_{t-1}$
Multiplier analysis: $\delta_0 + \delta_1 L + \delta_2 L^2 + \cdots + \delta_q L^q = (1 - \theta_1 L - \theta_2 L^2 - \cdots - \theta_p L^p)(\beta_0 + \beta_1 L + \beta_2 L^2 + \cdots)$

Unit Roots and Cointegration
Random walk: $y_t = y_{t-1} + v_t$
Random walk with drift: $y_t = \alpha + y_{t-1} + v_t$
Random walk with drift and time trend: $y_t = \alpha + \delta t + y_{t-1} + v_t$
Unit root test for stationarity, null hypothesis $H_0: \gamma = 0$:
Dickey-Fuller test 1 (no constant and no trend): $\Delta y_t = \gamma y_{t-1} + v_t$
Dickey-Fuller test 2 (with constant but no trend): $\Delta y_t = \alpha + \gamma y_{t-1} + v_t$
Dickey-Fuller test 3 (with constant and with trend): $\Delta y_t = \alpha + \gamma y_{t-1} + \lambda t + v_t$
Augmented Dickey-Fuller tests: $\Delta y_t = \alpha + \gamma y_{t-1} + \sum_{s=1}^{m} a_s \Delta y_{t-s} + v_t$
Test for cointegration: $\Delta \hat{e}_t = \gamma \hat{e}_{t-1} + v_t$

Panel Data
Pooled least squares regression: $y_{it} = \beta_1 + \beta_2 x_{2it} + \beta_3 x_{3it} + e_{it}$
Cluster-robust standard errors: $\mathrm{cov}(e_{it}, e_{is}) = \psi_{ts}$
Fixed effects model: $y_{it} = \beta_{1i} + \beta_2 x_{2it} + \beta_3 x_{3it} + e_{it}$, with $\beta_{1i}$ not random;
$y_{it} - \bar{y}_i = \beta_2(x_{2it} - \bar{x}_{2i}) + \beta_3(x_{3it} - \bar{x}_{3i}) + (e_{it} - \bar{e}_i)$
Random effects model: $y_{it} = \beta_{1i} + \beta_2 x_{2it} + \beta_3 x_{3it} + e_{it}$, with $\beta_{1i} = \bar{\beta}_1 + u_i$ random;
$y_{it} - \alpha\bar{y}_i = \bar{\beta}_1(1 - \alpha) + \beta_2(x_{2it} - \alpha\bar{x}_{2i}) + \beta_3(x_{3it} - \alpha\bar{x}_{3i}) + v_{it}^*$, where $\alpha = 1 - \frac{\sigma_e}{\sqrt{T\sigma_u^2 + \sigma_e^2}}$
Hausman test: $t = \frac{b_{FE,k} - b_{RE,k}}{\left[\widehat{\mathrm{var}}(b_{FE,k}) - \widehat{\mathrm{var}}(b_{RE,k})\right]^{1/2}}$
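The exponential smoothing recursion, $\hat{y}_t = \alpha y_{t-1} + (1 - \alpha)\hat{y}_{t-1}$, is simple enough to trace by hand. A minimal Python sketch with a made-up series, smoothing weight, and starting value:

```python
# One-step-ahead exponential smoothing:
#   yhat_t = alpha * y_{t-1} + (1 - alpha) * yhat_{t-1}
def exp_smooth(y, alpha, yhat0):
    """Return smoothed values yhat_1 ... yhat_T, starting from yhat0."""
    yhat = [yhat0]
    for yt in y[:-1]:
        yhat.append(alpha * yt + (1 - alpha) * yhat[-1])
    return yhat

series = [10.0, 12.0, 11.0, 13.0]
smoothed = exp_smooth(series, alpha=0.5, yhat0=10.0)
# smoothed[t] uses information only through period t - 1
```

With $\alpha = 0.5$ and $\hat{y}_0 = 10$, the recursion gives 10, 10, 11, 11: each smoothed value averages the previous observation with the previous smoothed value.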

