CMPSC 448 Lecture Notes - Feature Vector, Gradient Descent

Document Summary

CMPSC 448: Machine Learning and AI: HW 4 (due March 25). Your code will be graded based on correctness on different inputs (make sure it executes using Python version 2.x): gradient descent.

In the following questions, we use the following notation. $\vec{w} = (w_1, w_2, \ldots, w_k)$ is the weight vector with dimension $k$. The data is $\{(\vec{x}_1, t_1), \ldots, (\vec{x}_n, t_n)\}$. Each $\vec{x}_j$ is a feature vector whose components are $\vec{x}_j = (x_{j1}, x_{j2}, \ldots, x_{jk})$. We will be using the linear regression model, whose prediction for a feature vector $\vec{x}_j$ is $\vec{w} \cdot \vec{x}_j = \sum_{i=1}^{k} w_i x_{ji}$. If $t_j$ is the target and $\vec{w} \cdot \vec{x}_j$ is the prediction, we can measure the discrepancy using the one-half squared error: $f(t_j, \vec{w} \cdot \vec{x}_j) = \frac{1}{2}(t_j - \vec{w} \cdot \vec{x}_j)^2$. The average error over the training set is then:

$$\frac{1}{2n} \sum_{j=1}^{n} \left( t_j - \vec{w} \cdot \vec{x}_j \right)^2 = \frac{1}{2n} \sum_{j=1}^{n} \left( t_j - w_1 x_{j1} - w_2 x_{j2} - \cdots - w_k x_{jk} \right)^2.$$
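Gradient descent on this objective also needs its partial derivatives, which follow directly from the chain rule: $\frac{\partial}{\partial w_i} \frac{1}{2n} \sum_{j=1}^{n} (t_j - \vec{w} \cdot \vec{x}_j)^2 = -\frac{1}{n} \sum_{j=1}^{n} (t_j - \vec{w} \cdot \vec{x}_j)\, x_{ji}$. Below is a minimal sketch of batch gradient descent for this model. It is written as modern Python 3 rather than the Python 2.x the assignment targets, and the learning rate, iteration count, and function names are illustrative assumptions, not the graded interface.

```python
# Sketch of batch gradient descent for linear regression with
# one-half squared error. Hyperparameters (lr, iters) and the
# data layout [(x_tuple, target), ...] are assumptions for
# illustration, not the assignment's required interface.

def predict(w, x):
    """Linear model prediction: w . x = sum_i w_i * x_i."""
    return sum(wi * xi for wi, xi in zip(w, x))

def average_error(w, data):
    """Average one-half squared error: (1/2n) * sum_j (t_j - w . x_j)^2."""
    n = len(data)
    return sum((t - predict(w, x)) ** 2 for x, t in data) / (2.0 * n)

def gradient_descent(data, k, lr=0.1, iters=2000):
    """Minimize the average error by batch gradient descent.

    The partial derivative with respect to w_i is
    -(1/n) * sum_j (t_j - w . x_j) * x_ji.
    """
    w = [0.0] * k
    n = len(data)
    for _ in range(iters):
        grad = [0.0] * k
        for x, t in data:
            residual = t - predict(w, x)
            for i in range(k):
                grad[i] -= residual * x[i] / n
        # Step opposite the gradient direction.
        w = [wi - lr * gi for wi, gi in zip(w, grad)]
    return w

# Usage: recover w = (2, 3) from points generated by t = 2*x1 + 3*x2.
data = [((1.0, 0.0), 2.0), ((0.0, 1.0), 3.0),
        ((1.0, 1.0), 5.0), ((2.0, 1.0), 7.0)]
w = gradient_descent(data, k=2)
print(w)                       # approximately [2.0, 3.0]
print(average_error(w, data))  # near 0
```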
