ADMS 3300 Study Guide - Final Guide: Linear Programming Relaxation, Network Model, Mean Squared Error


Document Summary

Chapter 17: linear regression

- Regression line. Deterministic: an approximate relationship, y = a + bx. Probabilistic (real life): includes a random error term ε: y = a + bx + ε.
- Residuals (errors): e_i = y_i − ŷ_i, the difference between an observed point and the fitted value ŷ on the least squares regression line.
- Required conditions on the error variable: 1- normally distributed (check with a histogram of the residuals); 2- constant variance, i.e. no heteroscedasticity (plot residuals against ŷ); 5- no multicollinearity: the independent variables should not be highly correlated with one another, since multicollinearity may distort the t-tests (but not the F-test).
- R²: the percentage of variation in y explained by the variation in x. If the F-test is significant, reject H0: there is enough evidence to conclude that the model is valid.
- Multiple R = √R². R² = 1 − (SSE / SST). Adjusted R² = 1 − (MSE / (SST / df total)).
- t stat = coefficient / standard error (for each variable).
- Model (coefficients): ŷ = b0 + b1x1 + b2x2 + … + bkxk.

Chapter 19: model building. First order model (p = 1): ŷ = b0 + b1x.
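The simple regression formulas above (least squares line, residuals, and R² = 1 − SSE/SST) can be sketched in a few lines of Python. The data values here are made up purely for illustration; the formulas follow the definitions in the summary.

```python
import numpy as np

# Hypothetical data: x (independent variable) and y (dependent variable).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least squares estimates: b = Sxy / Sxx, a = mean(y) - b * mean(x)
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

# Fitted line y_hat and residuals e_i = y_i - y_hat_i
y_hat = a + b * x
residuals = y - y_hat

# R^2 = 1 - SSE/SST: proportion of variation in y explained by x
sse = np.sum(residuals ** 2)
sst = np.sum((y - y.mean()) ** 2)
r2 = 1 - sse / sst
```

A quick sanity check on the output: least squares residuals always sum to zero, and R² close to 1 indicates most of the variation in y is explained by x.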
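For the multiple regression quantities (ŷ = b0 + b1x1 + … + bkxk, adjusted R² = 1 − MSE/(SST/df total), and t stat = coefficient / standard error), a minimal sketch with NumPy, again on made-up data, is:

```python
import numpy as np

# Hypothetical data with k = 2 independent variables, n = 6 observations.
X_raw = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0],
                  [4.0, 3.0], [5.0, 6.0], [6.0, 5.0]])
y = np.array([3.1, 3.0, 7.2, 6.9, 11.1, 10.8])
n, k = X_raw.shape

# Design matrix with a column of ones for the intercept b0
X = np.column_stack([np.ones(n), X_raw])

# Least squares coefficients: b = (X'X)^(-1) X'y
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y

# Residual sum of squares (SSE) and total sum of squares (SST)
y_hat = X @ b
sse = np.sum((y - y_hat) ** 2)
sst = np.sum((y - y.mean()) ** 2)

# R^2 = 1 - SSE/SST; adjusted R^2 = 1 - MSE / (SST / df total)
mse = sse / (n - k - 1)   # residual mean square, df = n - k - 1
r2 = 1 - sse / sst
adj_r2 = 1 - mse / (sst / (n - 1))

# t stat for each coefficient = coefficient / its standard error
se = np.sqrt(np.diag(XtX_inv) * mse)
t_stats = b / se
```

Note that adjusted R² is never larger than R², since it penalizes for the number of predictors; each t_stats entry would be compared against a t distribution with n − k − 1 degrees of freedom.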