SOC350H5 Lecture 4: Bivariate Regression
Department: Sociology
Course: SOC350H5
Professor: David Pettinicchio
Semester: Winter

Description
Lecture 4: Bivariate Regression
- Start getting at the basic properties of regression
- Basis of the model for the project
- Leads into multivariate regression

Regression and Levels of Measurement
- Bivariate regression involves two variables: a dependent and an independent variable
- CANNOT do regression unless the dependent variable is interval-ratio
- Regression: a line on the graph that shows the slope (the relationship)
- OLS requires an interval-ratio dependent variable
- The independent variable can be categorical: gender is not interval-ratio, but comparing females against males and predicting the outcome is fine
- Dummy variables: dichotomous (0/1) predictors
- What about regression when the outcome is binary? If the dependent variable is interval, use regression; otherwise use logistic analysis (a small coding sketch follows these notes)

Scatterplot
- The graph associated with regression is the scatterplot: data points on a graph
- A quick first step: look at all the points and see how dispersed they are
- Every point in the data represents an (x, y) coordinate pair
- The line is supposed to capture the general pattern in the data
- Example: the relationship between willingness to pay higher prices to save the environment and a country's wealth (16 cases)
- Slide 7 slope interpretation: "for every one-unit increase in x (inequality), y increases by 0.3"
- Ordinary least squares is related to the line of fit: y = a + bx, where a is the intercept and b is the slope

Best fitting line
- Every line can be written as y = a + bx
- SPSS gives the best-fitting line when it fits a line to the data
- You fit the line to reduce the total amount of squared error; squaring accommodates negatives (points under the line)
- The line of fit is the line that minimizes the total squared error, which is what ordinary least squares means
- Good predictions make the least amount of error: when you make predictions from your graph, you want to reduce the total squared error
- The best-fitting line's predictions carry the "hat" (ŷ); it could otherwise be any line (see the least-squares sketch after these notes)

Errors
- OLS has the least amount of error associated with it; the residual is the leftover that is not explained
- Use the line-of-fit formula to predict: for x = 20, the graph actually shows y = 40, but the line equation gives ŷ = 41.8
- Residual: y - ŷ = 40 - 41.8 = -1.8, so the prediction is off by 1.8

Residual sum of squares
- The line of fit has the least amount of error: it minimizes the sum of all squared errors
- Crappy model = crappy errors = crappy predictions
- SPSS
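
A minimal Python sketch of the dummy-variable point above. The data, variable names, and numbers are hypothetical (not from the lecture); the point is that a dichotomous 0/1 predictor works fine in OLS, and its slope is simply the difference in group means. If the dependent variable itself were binary, OLS would not apply and a logistic model would be the usual choice.

```python
import numpy as np

# Hypothetical data (not from the lecture): an interval-ratio outcome score
# for four respondents coded 0 (male) and four coded 1 (female).
female = np.array([0, 0, 0, 0, 1, 1, 1, 1])                 # dichotomous dummy variable
score = np.array([3.0, 4.0, 5.0, 4.0, 6.0, 5.0, 7.0, 6.0])

# OLS fit of score = a + b * female; np.polyfit returns (slope, intercept).
b, a = np.polyfit(female, score, 1)

# With a 0/1 dummy, the intercept is the male mean and the slope is the
# female-minus-male difference in means.
print(round(a, 3), score[female == 0].mean())                              # 4.0 4.0
print(round(b, 3), score[female == 1].mean() - score[female == 0].mean())  # 2.0 2.0

# If the dependent variable were itself 0/1, OLS would not be appropriate;
# a logistic model (e.g., statsmodels' Logit) would be used instead.
```

Coding the categories as 0 and 1 is what makes the slope directly interpretable as a group difference.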
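A sketch of the "best fitting line" idea on made-up (x, y) points (not the 16-country data from the slides): the hand formulas for the OLS slope and intercept minimize the sum of squared errors, and they match numpy's built-in least-squares fit.

```python
import numpy as np

# Made-up (x, y) points standing in for the scatterplot on the slides.
x = np.array([5.0, 10.0, 12.0, 15.0, 18.0, 20.0, 22.0, 25.0])
y = np.array([12.0, 21.0, 26.0, 30.0, 38.0, 40.0, 46.0, 52.0])

# OLS formulas: b = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2), a = ybar - b*xbar.
# These are the slope and intercept that minimize the sum of squared errors.
x_bar, y_bar = x.mean(), y.mean()
b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
a = y_bar - b * x_bar

# numpy's built-in least-squares fit gives the same line.
b_check, a_check = np.polyfit(x, y, 1)
print(round(a, 3), round(b, 3))
print(round(a_check, 3), round(b_check, 3))

# Fitted (predicted) values carry the "hat": y_hat = a + b * x.
# Interpretation of b: every one-unit increase in x raises the predicted y by b units.
y_hat = a + b * x
```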
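Finally, a sketch of the prediction-error and residual-sum-of-squares bullets. The observed point (x = 20, y = 40) and the prediction ŷ = 41.8 come from the lecture example, but the intercept and slope used below are assumed values chosen only so the line reproduces ŷ(20) = 41.8; the slide's actual coefficients are not in the notes.

```python
import numpy as np

# Assumed coefficients, chosen only so the line reproduces the lecture's
# prediction y_hat = 41.8 at x = 20 (the slide's real intercept/slope are unknown).
a, b = 1.8, 2.0

x_new = 20
y_hat = a + b * x_new        # 41.8, the line's prediction
y_obs = 40.0                 # the observed value at x = 20 in the lecture example
residual = y_obs - y_hat     # y - y_hat = 40 - 41.8 = -1.8: off by 1.8

# Residual sum of squares (RSS): square each residual so points under the line
# (negative errors) do not cancel points above it, then add them up.
# OLS chooses the line with the smallest RSS; a bad model leaves a large RSS
# and makes bad predictions.
x = np.array([10.0, 15.0, 20.0, 25.0])   # made-up observations for illustration
y = np.array([22.0, 30.0, 40.0, 53.0])
rss = np.sum((y - (a + b * x)) ** 2)

print(y_hat, residual, round(rss, 2))
```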