COMP 4106 Lecture Notes - Lecture 6: The Algorithm, Liquid Oxygen, Binary Logarithm


Document Summary

A decision tree serves as a "concept representation" for deciding whether to play tennis: to classify an example, sort it through the tree to the appropriate leaf and return the classification associated with that leaf (here: yes or no). Entropy measures how mixed the training samples are. In all calculations involving entropy, 0 log 0 is taken to be 0. Suppose S is a collection of 14 examples of some boolean concept. If the collection contains unequal numbers of positive and negative examples, the entropy is between 0 and 1. Entropy has an information-theoretic interpretation: it specifies the minimum number of bits needed to encode the classification of an arbitrary member of S, assuming members of S are drawn at random with uniform probability. If p+ = 1, the receiver knows a drawn example will be positive, so no message need be sent and the entropy is zero.
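The two ideas above, sorting an example down a tree to a leaf and computing the entropy of a boolean collection, can be sketched as follows. The nested-dict tree and attribute names (`Outlook`, `Humidity`, `Wind`) are illustrative, modeled on the standard play-tennis example; the notes themselves do not specify the tree's shape.

```python
import math

# Hypothetical play-tennis tree: an internal node is {attribute: {value: subtree}},
# a leaf is just the string "yes" or "no".
TREE = {"Outlook": {
    "Sunny":    {"Humidity": {"High": "no", "Normal": "yes"}},
    "Overcast": "yes",
    "Rain":     {"Wind": {"Strong": "no", "Weak": "yes"}},
}}

def classify(tree, example):
    """Sort an example through the tree to a leaf and return its label."""
    while isinstance(tree, dict):
        attribute, branches = next(iter(tree.items()))
        tree = branches[example[attribute]]
    return tree

def entropy(pos, neg):
    """Entropy of a boolean collection with `pos` positive and `neg`
    negative examples; by convention 0 * log2(0) is taken to be 0."""
    total = pos + neg
    return -sum(p * math.log2(p)
                for p in (pos / total, neg / total) if p > 0)
```

For a perfectly mixed collection `entropy(7, 7)` is 1.0 and for a pure one `entropy(14, 0)` is 0.0, matching the boundary cases described above; an unequal split such as 9 positive and 5 negative lands strictly between 0 and 1.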
