CEE 5244 Lecture Notes - Lecture 61: Parity Function, Decision Tree Learning


Document Summary

Lecture 61

- Decision trees provide an expressive representation for learning discrete-valued functions, but they do not generalize well to certain types of boolean functions.
  - Example: the parity function. Class = 1 if there is an even number of boolean attributes with truth value = true; Class = 0 if there is an odd number of boolean attributes with truth value = true (see the sketch after this summary).
  - For accurate modeling, a complete tree is required.
- Decision trees are not expressive enough for modeling continuous variables, particularly when the test condition involves only a single attribute at a time.
- Decision boundary
- Oblique decision trees
- Tree replication
- Model evaluation
  - Metrics for performance evaluation: how to evaluate the performance of a model?
  - Methods for performance evaluation: how to obtain reliable estimates?
  - Methods for model comparison: how to compare the relative performance among competing models?
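
A minimal Python sketch of the parity target described in the summary (the name parity_class is illustrative, not from the lecture). Because flipping any single attribute flips the class, no small subset of attributes determines the label, which is why a tree that splits on one attribute at a time needs a complete tree to model it exactly:

    def parity_class(attributes):
        """Parity target from the lecture: Class = 1 when an even number of
        boolean attributes are True, Class = 0 when an odd number are True."""
        num_true = sum(bool(a) for a in attributes)
        return 1 if num_true % 2 == 0 else 0

    # Changing any one attribute changes the class, so every attribute
    # must be tested on every path from root to leaf.
    print(parity_class([True, False, True]))   # 2 Trues -> even -> Class = 1
    print(parity_class([True, False, False]))  # 1 True  -> odd  -> Class = 0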
