CISC483 Lecture Notes - Lecture 25: Absolute Difference, Data Mining


Document Summary

As the tree is built, construct a linear model for the instances at each internal node as well as for each leaf node, and compute the value predicted by the leaf node. Each node contains a regression formula of the form w0 + w1a1 + ... + wkak, where the ai are attributes and the wi are weights. At each interior node, consider pruning the regression formula by removing terms. Estimate the error rate at node n as En × (n + v) / (n − v), where n = # of training instances that reach node n, v = # of attributes in the linear regression model for the node, and En is the average of the absolute differences between predicted value and actual value over the training instances that reach node n. Compare this with the revised error estimate obtained by removing a term from the regression formula at node n, and greedily remove terms as long as the resulting error estimate is lower.
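The following is a minimal sketch of this greedy term-removal step, assuming the adjusted error estimate En × (n + v) / (n − v) described above. It is not from the notes: it assumes NumPy is available, fits the node's linear model by ordinary least squares, and uses illustrative names (adjusted_error, prune_terms, X, y).

import numpy as np

def adjusted_error(X, y):
    """Average absolute error En, inflated by (n + v) / (n - v)."""
    n, v = X.shape                        # n instances reach the node, v attributes remain
    A = np.column_stack([np.ones(n), X])  # add intercept term w0
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    e_n = np.mean(np.abs(A @ w - y))      # average absolute difference
    return e_n * (n + v) / (n - v) if n > v else np.inf

def prune_terms(X, y):
    """Greedily drop attributes while the adjusted error estimate decreases."""
    keep = list(range(X.shape[1]))
    best = adjusted_error(X[:, keep], y)
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for j in list(keep):
            trial = [k for k in keep if k != j]
            err = adjusted_error(X[:, trial], y)
            if err < best:                # removing term j lowers the estimate
                best, keep, improved = err, trial, True
                break
    return keep, best

Because the factor (n + v) / (n − v) grows as more attributes are kept, dropping a term can lower the estimate even though the raw training error En goes up slightly; that is what makes the greedy removal worthwhile.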
