Computer Science 4442A/B Lecture Notes - Lecture 4: Decision Boundary, Joule, Random Variable


Document Summary

K-nearest-neighbour (kNN) classifier: the simplest classifier on earth; it can be implemented in MATLAB, and it extends easily to multiple classes. Example with k = 5 and 3 fish species (salmon, sea bass, eel), using length and lightness as features: if the 5 nearest neighbours are 3 sea bass, 1 eel, and 1 salmon, the sample is classified as sea bass. Choosing k: with infinitely many samples, a larger k gives better classification, but all k neighbours still have to be close to the query point. In practice, k = 1 is often used for efficiency, but it can be sensitive to noise. For example, the point [1 100] gets classified from its nearest training sample [2 110] and ends up misclassified; the denser the samples, the less of a problem this is, but we rarely have samples dense enough. A sketch of such a kNN classifier is given below.
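The following is a minimal MATLAB sketch of the kNN idea described above (distance to all training samples, take the k closest, majority vote). The toy feature values, class encoding, and variable names are made up for illustration and are not from the notes; the subtraction uses implicit expansion, so it assumes MATLAB R2016b or later (older versions would need bsxfun).

    % Minimal kNN sketch on hypothetical toy data:
    % classes encoded as 1 = salmon, 2 = sea bass, 3 = eel,
    % each sample described by [length, lightness].
    Xtrain = [ 5 1;  6 2;  7 1;      % salmon
              10 6; 11 7; 12 6;      % sea bass
              20 3; 21 4; 22 3];     % eel
    ytrain = [1; 1; 1; 2; 2; 2; 3; 3; 3];

    k = 5;                           % number of neighbours to vote
    xquery = [11 5];                 % new fish to classify

    % Euclidean distance from the query to every training sample
    d = sqrt(sum((Xtrain - xquery).^2, 2));

    % Indices of the k nearest neighbours
    [~, idx] = sort(d);
    neighbours = ytrain(idx(1:k));

    % Majority vote among the k nearest labels
    label = mode(neighbours);
    fprintf('Predicted class: %d\n', label);

With k = 1 the same code reduces to taking the single closest sample, which is fast but, as noted above, sensitive to noisy training points.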
