ECON 6306 Lecture Notes - Lecture 7: Dependent And Independent Variables, General Linear Model, Coefficient Of Determination

Multivariate Regression
Previously, we assumed that the y variable
was affected by changes in only one x
variable. In reality, that is almost never
the case: many x variables jointly affect
the dependent variable. Right away, we run
into a problem. We are trying to measure
the effect of x on y, but it is almost
always the case that the x variables are
also correlated with each other. In the
figure to the right, the top-left diagram
shows this phenomenon: x1 and x2 not only
affect y but also each other. The ideal
scenario for us is shown in the bottom
left, where there is no correlation
between the two x variables. An extreme
case is shown in the bottom right, where
x1 and x2 are strongly correlated. This can be a problem in our regression model and is called multicollinearity. The good news is that,
as long as the correlation between predictors is not perfect, our regression models can still separate their effects.
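Before fitting anything, it can help to inspect how correlated the predictors actually are. A minimal sketch in R, using the iris data that appears later in these notes:

```r
# Pairwise correlations among the variables used below.
# A value near 1 or -1 between two predictors signals potential multicollinearity.
cor(iris[, c("Petal.Length", "Sepal.Width", "Sepal.Length")])
```

Here Sepal.Width and Sepal.Length play the roles of x1 and x2; their correlation entry tells us which of the diagrams above we are closest to.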
Partial regression coefficients
Even if there is correlation between x1 and x2, the regression coefficients can still be calculated. We can estimate the
effect of x1 on y while controlling for the effect of x2 on y. This means that the slope estimate we get for x1 is
net of x2: for any value of x2, the estimated effect of x1 on y stays
constant. R does this for us automatically.
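One way to see the "controlling for" idea in R (a sketch, reusing the iris example from the next section): fit y on x1 alone, then on x1 and x2 together, and compare the slope on x1. The two estimates generally differ, because the second is a partial coefficient.

```r
# Simple regression: Petal.Length on Sepal.Width alone
simple  <- lm(Petal.Length ~ Sepal.Width, data = iris)
# Multiple regression: the same slope, now controlling for Sepal.Length
partial <- lm(Petal.Length ~ Sepal.Width + Sepal.Length, data = iris)

coef(simple)["Sepal.Width"]   # slope ignoring Sepal.Length
coef(partial)["Sepal.Width"]  # partial slope, holding Sepal.Length fixed
```

The gap between the two numbers reflects how much of the simple slope was really Sepal.Length's effect leaking through the correlation between the predictors.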
Interpreting coefficients
To run a multivariate regression model, use the following snippet of code in an R script:

model1 <- lm(Petal.Length ~ Sepal.Width + Sepal.Length, data = iris)
summary(model1)
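The summary output reports one partial slope per predictor: each estimate is the expected change in Petal.Length for a one-unit change in that predictor, holding the other fixed. A sketch of pulling out the pieces, including the coefficient of determination mentioned in the lecture title:

```r
model1 <- lm(Petal.Length ~ Sepal.Width + Sepal.Length, data = iris)
summary(model1)$coefficients  # estimates, std. errors, t-values, p-values
summary(model1)$r.squared     # coefficient of determination (R^2)
```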
