STATS 500 – Lecture 4
Document Summary
Linear regression – estimation: the normal linear model. Assume the linear model y = Xβ + ε, where:
y is the n × 1 random response vector (observed)
X is the fixed n × p matrix of predictor variables (observed)
β is the p × 1 vector of fixed and unknown parameters (not observed)
ε is the n × 1 random vector of error terms (not observed)
Linear versus nonlinear model. A linear model is linear in the parameters: for example, y = β₀ + β₁x + β₂x² + ε is linear, because each term enters linearly in some βⱼ even though it is nonlinear in x. A nonlinear model, such as y = β₀e^(β₁x) + ε, cannot be written in this form.
Brief overview of least squares estimation. The least squares estimate minimizes the residual sum of squares ‖y − Xβ‖². Assuming X has rank p, the unique solution is β̂ = (X'X)⁻¹X'y.
The fitted values and residuals. The fitted values are ŷ = Xβ̂ = Py and the residuals are ε̂ = y − ŷ = (I − P)y, where P = X(X'X)⁻¹X' is the linear transformation representing the orthogonal projection of n-dimensional Euclidean space onto the column space of X.
Properties of least squares estimates. The Gauss–Markov theorem states that β̂ is the best linear unbiased estimate (BLUE) of β, i.e. it has minimum variance in the class of linear unbiased estimators.
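The least squares solution, fitted values, and projection matrix described above can be sketched numerically. This is a minimal illustration with simulated data (the sample sizes and parameter values are made up for the example), checking that P is a symmetric, idempotent projection and that the residuals are orthogonal to the columns of X:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations, p predictors (first column is an intercept).
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])            # true parameters (unknown in practice)
y = X @ beta + rng.normal(scale=0.3, size=n)  # y = X beta + error

# Least squares estimate: beta_hat = (X'X)^{-1} X'y (X assumed to have rank p).
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Hat matrix P = X (X'X)^{-1} X' projects y onto the column space of X.
P = X @ np.linalg.solve(X.T @ X, X.T)
y_fitted = P @ y                 # fitted values, identical to X @ beta_hat
residuals = y - y_fitted         # residuals, identical to (I - P) @ y

# Orthogonal-projection properties: P symmetric, P idempotent,
# and residuals orthogonal to every column of X.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)
assert np.allclose(X.T @ residuals, 0)
```

Solving the normal equations with `np.linalg.solve` avoids forming the explicit inverse (X'X)⁻¹, which is more stable numerically; for nearly rank-deficient X one would use `np.linalg.lstsq` instead.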