STAT312 Lecture : Least squares; eigenvalues and eigenvectors.pdf


Document Summary

Another application of the QR decomposition is in regression. Write X = Q1 R1, where Q1 has orthonormal columns (Q1'Q1 = I) and R1 is upper triangular and non-singular, so that X'X = R1'R1. Apply Gram–Schmidt once again to independent columns of a basis for the column space of I − H to obtain Q2, whose columns are orthonormal (Q2'Q2 = I) and orthogonal to those of Q1. Then Q = (Q1 Q2) has orthonormal columns and is square, hence is an orthogonal matrix (QQ' = Q'Q = I). Least squares estimation in terms of the hat matrix H: since ||x + y||^2 = ||x||^2 + ||y||^2 whenever x and y are orthogonal, the residual norm decomposes as ||y − Xb||^2 = ||Hy − Xb||^2 + ||(I − H)y||^2 ≥ ||(I − H)y||^2, with equality if and only if Xb = Hy. Thus the fitted values ŷ = Xβ̂ = Hy are orthogonal to the residuals e = y − ŷ = (I − H)y.
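A minimal sketch of the idea above in NumPy (not part of the original notes; the data here are simulated for illustration): compute the thin QR factorization X = Q1 R1, solve R1 β = Q1'y by back-substitution, and check that the fitted values Hy are orthogonal to the residuals (I − H)y.

```python
import numpy as np

# Simulated design matrix and response (illustrative assumption, not from the notes).
rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Thin QR: Q1 is n x p with orthonormal columns, R1 is p x p upper triangular.
Q1, R1 = np.linalg.qr(X)

# Normal equations X'X b = X'y reduce to R1 b = Q1'y since X'X = R1'R1.
beta_hat = np.linalg.solve(R1, Q1.T @ y)

# Hat matrix H = Q1 Q1' projects onto the column space of X.
H = Q1 @ Q1.T
y_hat = H @ y          # fitted values  ŷ = Hy
e = y - y_hat          # residuals      e = (I - H)y

# Fitted values and residuals are orthogonal, as the norm decomposition requires.
print(abs(y_hat @ e) < 1e-10)
```

Solving the triangular system R1 β = Q1'y avoids forming X'X explicitly, which is the usual numerical motivation for the QR approach to least squares.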
