Title | Closed form solution for Ridge regression |
---|---|
Author | Shamraiz Rafeeq |
Course | Applied Statistics |
Institution | University of Essex |
Pages | 17 |
7.1 Closed form solution for Ridge regression
Show that the Ridge optimization problem has the closed form solution
$$\hat{\beta}^{\text{Ridge}}_{\lambda} = (X^T X + \lambda I)^{-1} X^T y.$$
Hint: calculate the gradient of the loss function $\ell_{\text{Ridge}}(\beta \mid y, X) = \mathrm{RSS}(\beta) + \lambda \lVert \beta \rVert_2^2$, set it equal to zero, and solve for $\beta$.
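The closed form can be checked numerically: at $\hat{\beta}^{\text{Ridge}}_{\lambda}$ the gradient of the penalized loss must vanish. A minimal sketch with toy data (the dimensions, seed, and $\lambda$ below are my own illustrative choices, not part of the exercise):

```python
import numpy as np

# Toy data: any n x p design X and response y will do for the check.
rng = np.random.default_rng(0)
n, p, lam = 50, 5, 2.0
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

# Closed-form Ridge solution: beta = (X'X + lam I)^{-1} X'y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Gradient of l(beta) = ||y - X beta||^2 + lam ||beta||^2:
#   grad = -2 X'(y - X beta) + 2 lam beta
# It should be (numerically) zero at the closed-form solution.
grad = -2 * X.T @ (y - X @ beta_ridge) + 2 * lam * beta_ridge
print(np.max(np.abs(grad)))  # numerically ~ 0
```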
7.2 Ridge and Lasso for the orthonormal design
Calculate the Ridge and the Lasso solutions for the special case of an orthonormal design matrix.
7.3 Bayesian interpretation of Ridge regression
1. Write down the log-likelihood of the linear regression model. Note: $Y_i = X_i^T \beta + \epsilon_i$, where $\epsilon_1, \dots, \epsilon_n$ are iid $N(0, \sigma^2)$ and $X$ is a fixed $n \times p$ design matrix.
2. Find the expression for the maximum likelihood estimator.
3. Assuming a prior distribution $\beta_1, \dots, \beta_p$ iid $\sim N(0, \tau^2)$, derive the posterior distribution of $\beta$ and show that the maximum a posteriori (MAP) estimator coincides with the Ridge estimator.
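A numerical sanity check of the result in item 3: with Gaussian noise $N(0,\sigma^2)$ and the iid Gaussian prior $N(0,\tau^2)$, the MAP estimate solves $(X^T X + (\sigma^2/\tau^2) I)\beta = X^T y$, i.e. it is the Ridge estimator with $\lambda = \sigma^2/\tau^2$. The toy data and parameter values below are my own:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 60, 3
sigma, tau = 0.8, 2.0                      # noise and prior standard deviations
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=sigma, size=n)

# MAP = Ridge with lam = sigma^2 / tau^2
lam = sigma**2 / tau**2
beta_map = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Gradient of the negative log-posterior
#   (1/(2 sigma^2)) ||y - X beta||^2 + (1/(2 tau^2)) ||beta||^2
# should vanish at the MAP estimate:
grad = X.T @ (X @ beta_map - y) / sigma**2 + beta_map / tau**2
print(np.max(np.abs(grad)))  # numerically ~ 0
```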
7.4 Prostate Cancer data
Explore the prostate cancer data set described in the book by Hastie, Tibshirani, and Friedman (2001).
4. Download the prostate cancer data set from https://web.stanford.edu/~hastie/ElemStatLearn/datasets.
5. Run OLS, Best Subset selection (use the leaps package), Ridge regression and Lasso regression.
6. Print the coefficients as a table.
7. Calculate the test error for each method and compare the results with Table 3.3 in Hastie, Tibshirani, and Friedman (2001).
The solution to this exercise: read the data set and create training and test data. dat...
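The workflow in steps 4-7 can be sketched as follows. This runs on synthetic stand-in data of the same shape as the prostate set (97 rows, 8 predictors, 67/30 train/test split); the real data, column names, and split indicator live at the Stanford URL above. Best Subset selection via the R `leaps` package has no direct numpy analogue and is omitted; the $\lambda$ value is illustrative, not tuned:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, lam = 97, 8, 1.0
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, 0.5, 0.0, 0.0, -0.7, 0.0, 0.3, 0.0])
y = X @ beta_true + rng.normal(scale=0.7, size=n)

# Train/test split (67 train / 30 test, as in the book's setup)
Xtr, Xte, ytr, yte = X[:67], X[67:], y[:67], y[67:]

# OLS and Ridge via their closed forms
beta_ols = np.linalg.lstsq(Xtr, ytr, rcond=None)[0]
beta_ridge = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(p), Xtr.T @ ytr)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate descent for 0.5*||y - X b||^2 + lam*||b||_1."""
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            r = y - X @ b + X[:, j] * b[j]        # partial residual without b_j
            z = X[:, j] @ r
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / (X[:, j] @ X[:, j])
    return b

beta_lasso = lasso_cd(Xtr, ytr, lam)

# Coefficient table and test errors, one row per method
for name, b in [("OLS", beta_ols), ("Ridge", beta_ridge), ("Lasso", beta_lasso)]:
    test_err = np.mean((yte - Xte @ b) ** 2)
    print(f"{name:6s} test MSE = {test_err:.3f}  coefs = {np.round(b, 3)}")
```

On the real data the resulting test errors can then be compared with Table 3.3 of the book.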