
# OMIS 2010 Chapter Notes - Chapter 16: Regression Analysis, Interval Estimation

by OC214870

School: York University
Department: Operations Management and Information Systems
Course Code: OMIS 2010
Professor: Alan Marshall

Chapter 16: Simple Linear Regression and Correlation

-Regression analysis: used to predict the value of one variable on the basis of other variables

-Dependent variable: the variable to be forecast

-Independent variables: variables that the practitioner believes are related to the dependent variable (x1, x2, ..., xk)

-Correlation analysis: determines whether a relationship exists between two variables

Model

-Deterministic models: equations that allow us to determine the value of the dependent variable from the values of the independent variables

-Probabilistic model: a model that includes a random term to represent randomness

-e is the error variable: accounts for all variables, measurable and immeasurable, that are not part of the model

oIts value varies from one 'sale' to the next even if 'x' remains constant

-First-order linear model (simple linear regression model): y = B0 + B1x + e, where y is the dependent variable, B0 is the y-intercept, B1 is the slope, x is the independent variable, and e is the error variable

-x and y must be interval (quantitative) data
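A minimal Python sketch of the probabilistic model above; the parameter values (beta0 = 5, beta1 = 2, sigma_e = 1.5) are made up for illustration and are not from the chapter:

```python
import random

# Hypothetical population parameters (illustrative only, not from the notes).
beta0, beta1, sigma_e = 5.0, 2.0, 1.5

random.seed(42)  # fixed seed so the simulation is repeatable

def simulate_y(x):
    """Draw one observation from y = beta0 + beta1*x + e,
    where the error e is normal with mean 0 and std dev sigma_e."""
    e = random.gauss(0.0, sigma_e)
    return beta0 + beta1 * x + e

# One simulated (x, y) sample of 10 observations.
sample = [(x, simulate_y(x)) for x in range(1, 11)]
```

Note that even if the same x is used twice, the two y values differ, because the error term e varies from one observation to the next.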

Estimating the Coefficients

-Draw a random sample from the population of interest

-Calculate sample statistics to estimate B0 and B1

-Estimators are based on drawing a straight line through the sample data; the least squares line comes closest to the sample data points

oY-hat (predicted/fitted value of y) = b0 + b1x

ob0 and b1 are calculated so that the sum of squared deviations is minimized

oY-hat on average comes closest to the observed values of y

oLeast squares method: produces the straight line that minimizes the sum of the squared differences between the points and the line

ob0 and b1 are unbiased estimators of B0 and B1

oResiduals: deviations between the actual data points and the line, ei

oei = yi - yi-hat

▪Residuals are observations of the error variable


▪The minimized sum of squared deviations is called the sum of squares for error (SSE)

▪Residuals are the differences between the observed values yi and the fitted values yi-hat

-Note: we can't reliably determine y-hat for a value of x that is far outside the range of the sample values of x
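The least squares procedure above can be sketched in plain Python using the standard formulas b1 = sum((x - x̄)(y - ȳ)) / sum((x - x̄)²) and b0 = ȳ - b1·x̄; the sample data below are made up for illustration:

```python
def least_squares(xs, ys):
    """Return (b0, b1) minimizing the sum of squared residuals."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Slope: covariation of x and y over variation of x.
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    b1 = sxy / sxx
    # Intercept: forces the line through (x_bar, y_bar).
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Hypothetical sample data (illustrative only).
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

b0, b1 = least_squares(xs, ys)

# Residuals e_i = y_i - y_hat_i; for the least squares line they sum to 0.
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
```

A useful check on any least squares fit: the residuals always sum to (essentially) zero, which is a consequence of minimizing the squared deviations.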

Error Variable: Required Conditions

-Required conditions for the error variable:

1. The probability distribution of e is normal

2. The mean of the distribution is 0: E(e) = 0

3. The standard deviation of e is sigma-e, which is constant regardless of the value of x

4. The value of e associated with any particular value of y is independent of the e associated with any other value of y

-For conditions 1, 2, and 3: for each value of x, y is a normally distributed random variable with mean E(y) = B0 + B1x and standard deviation sigma-e

oThe mean depends on x; the standard deviation is constant for all values of x

oFor each x, y is normally distributed with the same standard deviation
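Conditions 1-3 can be illustrated by simulation: hold x fixed, draw y many times, and check that the sample mean is close to B0 + B1x while the sample standard deviation is close to sigma-e. The parameter values below are made up for illustration:

```python
import random
import statistics

# Hypothetical parameters (illustrative only, not from the notes).
beta0, beta1, sigma_e = 5.0, 2.0, 1.5

random.seed(0)  # fixed seed for repeatability

x = 4.0  # hold the independent variable fixed
ys = [beta0 + beta1 * x + random.gauss(0.0, sigma_e) for _ in range(100_000)]

# Under conditions 1-3, y | x is normal with mean B0 + B1*x = 13.0
# and standard deviation sigma-e = 1.5, the same for every x.
print(round(statistics.mean(ys), 2))   # close to 13.0
print(round(statistics.stdev(ys), 2))  # close to 1.5
```

Repeating this with a different x shifts the mean but leaves the standard deviation unchanged, which is exactly what condition 3 asserts.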

Observational and Experimental Data

-The objective is to see how the independent variable is related to the dependent variable

-When the data are observational, both variables are random variables (we don't need to specify which is dependent and which is independent)

-The two variables must be bivariate normally distributed

Assessing the Model

-The least squares method produces the best straight line

oThere may still be no relationship, or a nonlinear relationship, between the two variables

-Three ways to assess the model: the standard error of estimate, the t-test of the slope, and the coefficient of determination

Sum of Squares for Error

-The least squares method determines the coefficients that minimize the sum of squared deviations between the points and the line defined by the coefficients

Standard Error of Estimate

-If sigma e is large, some of the errors will be large
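SSE and the standard error of estimate can be computed directly; the usual estimator of sigma-e is s-e = sqrt(SSE / (n - 2)). A sketch with made-up sample data (the same illustrative numbers as before, not from the chapter):

```python
import math

def sse_and_std_error(xs, ys):
    """Fit the least squares line, then return (SSE, s_e) where
    SSE is the sum of squared residuals and s_e = sqrt(SSE / (n - 2))."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    b1 = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
          / sum((x - x_bar) ** 2 for x in xs))
    b0 = y_bar - b1 * x_bar
    # SSE: the minimized sum of squared deviations from the fitted line.
    sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    return sse, math.sqrt(sse / (n - 2))

# Hypothetical sample data (illustrative only).
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

sse, s_e = sse_and_std_error(xs, ys)
```

A small s-e relative to the scale of y suggests the line fits the sample points closely; a large s-e means many errors, and hence many residuals, will be large.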
