STATS 500 – Lecture 4

Document Summary

Linear regression – estimation: the normal linear model. Assume the linear model y = Xβ + ε, where y is the n x 1 random response vector (observed), X is the fixed n x p matrix of predictor variables (observed), β is the p x 1 vector of fixed and unknown parameters (not observed), and ε is the n x 1 random vector of error terms (not observed).

Linear versus nonlinear models: a linear model is linear in the parameters (for example, y = β0 + β1 x + β2 x² + ε is linear in β even though it is nonlinear in x); a nonlinear model is not linear in its parameters.

Brief overview of least squares estimation: assuming X has rank p, the unique least squares solution is β̂ = (X'X)⁻¹X'y. The fitted values are ŷ = Xβ̂ = Py and the residuals are ε̂ = y − ŷ, where P = X(X'X)⁻¹X' is the linear transformation representing the orthogonal projection of n-dimensional Euclidean space onto the column space of X.

Properties of least squares estimates: the Gauss–Markov theorem states that β̂ is the best linear unbiased estimate (BLUE) of β, i.e. it has minimum variance in the class of linear unbiased estimators.
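The least squares computations summarized above can be illustrated with a short NumPy sketch (not part of the original notes; the simulated data, sample sizes, and variable names are illustrative assumptions):

# Minimal sketch of least squares estimation for the normal linear
# model y = X beta + eps, using simulated data for illustration.
import numpy as np

rng = np.random.default_rng(0)

n, p = 50, 3
# Design matrix X (n x p) with an intercept column; beta_true is assumed.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Least squares estimate beta_hat = (X'X)^{-1} X'y, assuming X has rank p.
# np.linalg.solve is used rather than forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Projection ("hat") matrix P = X (X'X)^{-1} X', fitted values, residuals.
P = X @ np.linalg.solve(X.T @ X, X.T)
fitted = P @ y              # equivalently X @ beta_hat
residuals = y - fitted

# P is symmetric and idempotent, as expected for an orthogonal projection.
assert np.allclose(P, P.T) and np.allclose(P @ P, P)
print(beta_hat)

Running the sketch recovers estimates close to beta_true; the assertions check the defining properties of the orthogonal projection P discussed in the summary.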
