BNAD 276 Lecture Notes - Lecture 10: Linear Regression, Johnson & Johnson, Null Hypothesis


Document Summary

Introductory case: analyzing the winning percentage in baseball. Sports analysts frequently quarrel over which statistics separate winning teams from the losers. We will fit three regression models and use the statistical significance of the predictors to help decide. With two explanatory variables to choose from, we can formulate three linear models: Model 1: win = β₀ + β₁BA + ε; Model 2: win = β₀ + β₁ERA + ε; Model 3: win = β₀ + β₁BA + β₂ERA + ε. A smaller standard error of the estimate (se) and a higher R² indicate a better model. In general, for a model y = β₀ + β₁x₁ + β₂x₂ + ε, we can test whether βⱼ is =, >, or < some hypothesized value βⱼ₀, so the test can take one of three forms (two-tailed, right-tailed, or left-tailed). The appropriate test statistic is t(df) = (bⱼ − βⱼ₀) / s(bⱼ), where s(bⱼ) is the standard error of the estimator bⱼ. The test statistic follows a t-distribution with df = n − k − 1 degrees of freedom.
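The model fitting and coefficient test described above can be sketched in Python with NumPy. The data below are made-up illustrative numbers (not real MLB statistics), used only to show how the estimates bⱼ, their standard errors s(bⱼ), and the t statistics t(df) = (bⱼ − βⱼ₀)/s(bⱼ) with df = n − k − 1 would be computed for Model 3:

```python
import numpy as np

# Hypothetical data for n = 8 teams (illustrative values, not real stats):
# win = winning proportion, ba = batting average, era = earned run average.
win = np.array([0.62, 0.55, 0.48, 0.59, 0.44, 0.51, 0.66, 0.40])
ba  = np.array([0.270, 0.262, 0.255, 0.268, 0.248, 0.260, 0.275, 0.245])
era = np.array([3.40, 3.80, 4.20, 3.60, 4.50, 4.00, 3.30, 4.70])

# Model 3: win = b0 + b1*ba + b2*era + e
X = np.column_stack([np.ones_like(win), ba, era])  # design matrix with intercept
n, kp1 = X.shape            # kp1 = k + 1 (intercept plus k = 2 predictors)
df = n - kp1                # degrees of freedom: df = n - k - 1

b, *_ = np.linalg.lstsq(X, win, rcond=None)  # OLS estimates b0, b1, b2
resid = win - X @ b
se = np.sqrt(resid @ resid / df)             # standard error of the estimate (se)
cov_b = se**2 * np.linalg.inv(X.T @ X)       # covariance matrix of the estimators
s_b = np.sqrt(np.diag(cov_b))                # standard errors s(bj)

# Test H0: beta_j = 0 for each coefficient: t(df) = (bj - 0) / s(bj)
t_stats = b / s_b
print("df =", df)
print("estimates:", np.round(b, 4))
print("t statistics:", np.round(t_stats, 3))
```

Comparing Model 1, Model 2, and Model 3 would just mean changing the columns of `X`; per the notes, the model with the smaller se and higher R² is preferred.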
