RSM318H1 Chapter Notes - Chapter 4: Linear Discriminant Analysis, Likelihood Function

Document Summary

We could code a categorical response variable and fit a linear regression model, but the coding implies an ordering of the categories and fixed magnitudes for the differences between outcomes. Coding a binary variable for linear regression can also produce estimated probabilities outside the [0, 1] range. While this gives a crude representation for a binary response, it does not extend well to qualitative variables with more than two levels.

Logistic regression instead models the probability that Y belongs to a particular category. It produces an S-shaped curve, so its predictions always lie between 0 and 1. The odds can take any value between 0 and infinity; odds near 0 correspond to a low probability and very large odds to a high probability. We seek coefficient estimates such that the predicted probability for each observation corresponds as closely as possible to the observed response, and maximum likelihood is used to fit the model. Logistic regression can be extended to more than two classes, but discriminant analysis is more popular for those applications, in part because the logistic regression estimates become surprisingly unstable when the classes are well separated.

4.4.1 Using Bayes' theorem for classification: model X given Y, i.e. model the distribution of the predictors X separately in each of the response classes, then use Bayes' theorem to turn these into estimates of the probability that Y belongs to each class.
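The summary above describes logistic regression fit by maximum likelihood and Bayes'-theorem classification only in words. Below is a minimal NumPy sketch (not part of the original notes) illustrating both ideas: a hand-rolled gradient-ascent fit of the logistic model, and a toy posterior computed with Bayes' theorem under Gaussian class densities sharing one variance (the LDA assumption). The function names (sigmoid, fit_logistic, bayes_posterior) and the toy data are illustrative assumptions, not anything taken from the course materials.

```python
import numpy as np

def sigmoid(z):
    """Logistic (S-shaped) function: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Estimate the coefficients by maximum likelihood using plain gradient
    ascent on the log-likelihood (a toy stand-in for the numerical routines
    a real statistics package would use)."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X1 @ beta)                   # predicted P(Y = 1 | X)
        beta += lr * X1.T @ (y - p) / len(y)     # gradient of the log-likelihood
    return beta

def gaussian_pdf(x, mu, sigma):
    """Normal density, used as the class-conditional model for X given Y."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def bayes_posterior(x, priors, mus, sigma):
    """P(Y = k | X = x) = pi_k f_k(x) / sum_l pi_l f_l(x), with Gaussian f_k
    sharing one variance across classes (the LDA assumption)."""
    dens = np.array([p * gaussian_pdf(x, m, sigma) for p, m in zip(priors, mus)])
    return dens / dens.sum(axis=0)

# Toy data: one predictor, binary response generated from a logistic model.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (rng.uniform(size=200) < sigmoid(2.0 * x - 0.5)).astype(float)

beta = fit_logistic(x.reshape(-1, 1), y)
p_hat = sigmoid(np.column_stack([np.ones_like(x), x]) @ beta)
print(beta)                        # estimated (beta_0, beta_1)
print(p_hat.min(), p_hat.max())    # predictions always stay inside (0, 1)

# Bayes'-theorem classification for two classes with equal priors.
print(bayes_posterior(0.0, priors=[0.5, 0.5], mus=[-1.0, 1.0], sigma=1.0))
```

In practice one would rely on a statistics package's logistic regression and discriminant analysis routines rather than hand-rolled gradient ascent; the sketch is only meant to make the fitted-probability and Bayes'-theorem steps concrete.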
