QBUS3820 Lecture Notes - Lecture 7: Dependent And Independent Variables, False Positives And False Negatives, Conditional Independence
QBUS3820: Machine Learning and Data
Mining in Business
Lecture 7: Classification I
Semester 1, 2018
Discipline of Business Analytics, The University of Sydney Business School
find more resources at oneclass.com
Module 10: Classification I
1. Brief review of Lasso and Ridge
2. Classification
3. Review of Bayes' rule
4. Introduction to decision theory for classification
5. K-nearest neighbours classifier
6. Naïve Bayes classifier
7. Model evaluation for binary classification
Brief review of Lasso and Ridge
Document Summary

Discipline of Business Analytics, The University of Sydney Business School. Module 10, Classification I, covers: a brief review of lasso and ridge, classification, a review of Bayes' rule, an introduction to decision theory for classification, the k-nearest neighbours classifier, the naïve Bayes classifier, and model evaluation for binary classification.

The lasso (least absolute shrinkage and selection operator) method solves the following problem:

$$\hat{\beta}^{\text{lasso}} = \arg\min_{\beta} \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \Big)^2 + \lambda \sum_{j=1}^{p} |\beta_j|,$$

where $\lambda \ge 0$ is a tuning parameter.

The ridge regression method solves the following problem:

$$\hat{\beta}^{\text{ridge}} = \arg\min_{\beta} \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \Big)^2 + \lambda \sum_{j=1}^{p} \beta_j^2,$$

where $\lambda \ge 0$ is a tuning parameter.

When the predictors are orthonormal, best subset, ridge, and lasso each have closed-form solutions in terms of the OLS estimates:

- Best subset (size $k$): $\hat{\beta}_j = \hat{\beta}_j^{\text{OLS}} \cdot I\big(|\hat{\beta}_j^{\text{OLS}}| \text{ is one of the } k \text{ largest}\big)$
- Ridge: $\hat{\beta}_j^{\text{ridge}} = \hat{\beta}_j^{\text{OLS}} / (1 + \lambda)$
- Lasso: $\hat{\beta}_j^{\text{lasso}} = \mathrm{sign}(\hat{\beta}_j^{\text{OLS}})\big(|\hat{\beta}_j^{\text{OLS}}| - \lambda\big)_{+}$
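The orthonormal-predictor closed forms above can be illustrated numerically. The sketch below applies each shrinkage rule to a vector of OLS coefficients; the coefficient values, the tuning parameter value, and the subset size k are all hypothetical and chosen only to show the qualitative behaviour (ridge shrinks proportionally, lasso soft-thresholds to exact zeros, best subset keeps the k largest).

```python
import numpy as np

# Hypothetical OLS coefficients under orthonormal predictors (X'X = I)
beta_ols = np.array([3.0, -1.5, 0.4, -0.2])
lam = 0.5  # tuning parameter (assumed value for illustration)

# Ridge: proportional shrinkage of every coefficient toward zero
beta_ridge = beta_ols / (1.0 + lam)

# Lasso: soft-thresholding -- shrink |beta| by lam, clip at zero, keep sign
beta_lasso = np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - lam, 0.0)

# Best subset of size k: keep only the k largest coefficients in magnitude
k = 2
keep = np.argsort(-np.abs(beta_ols))[:k]
beta_subset = np.where(np.isin(np.arange(beta_ols.size), keep), beta_ols, 0.0)

print(beta_ridge)   # every coefficient shrunk by the same factor 1/(1+lam)
print(beta_lasso)   # the two small coefficients are set exactly to zero
print(beta_subset)  # only the k = 2 largest OLS coefficients survive
```

Note how lasso and best subset both produce exact zeros (variable selection), while ridge never does; this is the key practical difference the summary's closed forms encode.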